Debug Information
Processing Details
- VTT File: mss541_Madeleine_Beekman_2025_09_02.vtt
- Processing Time: September 10, 2025 at 08:58 AM
- Total Chunks: 2
- Transcript Length: 92,875 characters
- Caption Count: 837 captions
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.480 --> 00:00:04.560] Ever notice how ads always pop up at the worst moments?
[00:00:04.560 --> 00:00:08.640] When the killer's identity is about to be revealed.
[00:00:08.640 --> 00:00:12.320] During that perfect meditation flow.
[00:00:12.320 --> 00:00:16.240] On Amazon Music, we believe in keeping you in the moment.
[00:00:16.240 --> 00:00:26.080] That's why we've got millions of ad-free podcast episodes so you can stay completely immersed in every story, every reveal, every breath.
[00:00:26.080 --> 00:00:33.120] Download the Amazon Music app and start listening to your favorite podcasts ad-free, included with Prime.
[00:00:33.760 --> 00:00:38.800] FanDuel's getting you ready for the NFL season with an offer you won't want to miss.
[00:00:38.800 --> 00:00:44.640] New customers can bet just $5 and get $300 in bonus points if you win.
[00:00:44.640 --> 00:00:45.360] That's right.
[00:00:45.360 --> 00:00:53.440] Pick a bet, bet $5, and if it wins, you'll unlock $300 in bonus bets to use all across the app.
[00:00:53.440 --> 00:01:00.240] You can build parlays, bet player props, ride the live lines, whatever your style, FanDuel has you covered.
[00:01:00.240 --> 00:01:02.240] The NFL season is almost here.
[00:01:02.240 --> 00:01:04.880] The only question is: are you ready to play?
[00:01:04.880 --> 00:01:10.640] Visit FanDuel.com slash bet big to download the FanDuel app today and get started.
[00:01:10.640 --> 00:01:19.600] Must be 21 plus and present in select states for Kansas in affiliation with Kansas Star Casino or 18 plus and present in DC, Kentucky, or Wyoming.
[00:01:19.600 --> 00:01:21.680] First online real money wager only.
[00:01:21.680 --> 00:01:23.520] $5 first deposit required.
[00:01:23.520 --> 00:01:28.160] Bonus issued as non-withdrawable bonus bets, which expire seven days after receipt.
[00:01:28.160 --> 00:01:29.280] Restrictions apply.
[00:01:29.280 --> 00:01:32.160] See terms at sportsbook.fanuel.com.
[00:01:32.160 --> 00:01:33.120] Gambling problem?
[00:01:33.120 --> 00:01:37.040] Call 1-800-GAMBLER or visit fanduel.com slash RG.
[00:01:37.040 --> 00:01:44.080] Call 1-888-789-7777 or visit cpg.org slash chat in Connecticut.
[00:01:44.080 --> 00:01:47.040] Or visit mdgamblinghelp.org in Maryland.
[00:01:47.040 --> 00:01:47.760] Hope is here.
[00:01:47.760 --> 00:01:50.160] Visit gamblinghelpline ma.org.
[00:01:50.160 --> 00:01:55.440] Or call 800-327-5050 for 24-7 support in Massachusetts.
[00:01:55.440 --> 00:02:01.240] Or call 1-877-8 HOPE-N-Y or text H-O-P-E-N-Y in New York.
[00:02:04.760 --> 00:02:10.280] You're listening to The Michael Shermer Show.
[00:02:16.680 --> 00:02:19.240] All right, so your book, The Origin of Language.
[00:02:19.240 --> 00:02:31.240] I love this subject because I love origin questions: the origin of the universe, the origins of life, the origins of complex life, the origins of humanity, the origins of language, morality.
[00:02:31.240 --> 00:02:32.120] That's pretty much it.
[00:02:32.120 --> 00:02:34.920] Those are the big questions that I think scientists should tackle.
[00:02:34.920 --> 00:02:37.000] So, you know, good on you for taking this on.
[00:02:37.000 --> 00:02:38.120] Give us a little bit of background.
[00:02:38.520 --> 00:02:39.240] What's your story?
[00:02:39.240 --> 00:02:40.680] How'd you get into this?
[00:02:41.320 --> 00:02:42.120] Well, it's interesting.
[00:02:42.360 --> 00:02:53.720] So I'm an experimental biologist, and well, as you mentioned, I got a fellowship at the Institute of Advanced Studies in Berlin, which is just a think tank, basically.
[00:02:53.880 --> 00:02:56.440] You can't do any experimental work.
[00:02:56.760 --> 00:02:59.080] So I decided I was going to write a book.
[00:02:59.080 --> 00:03:03.000] And I had this very ambitious project in mind.
[00:03:03.000 --> 00:03:09.080] And the book was going to be called The What Was it called?
[00:03:09.880 --> 00:03:10.680] It doesn't matter.
[00:03:10.680 --> 00:03:11.960] Oh, How Evolution Works.
[00:03:11.960 --> 00:03:13.000] Yes, How Evolution Works.
[00:03:13.720 --> 00:03:17.880] And of course, I didn't know that Ernst Meyer had already written a book with that title.
[00:03:18.040 --> 00:03:22.760] Of course, he's a much more famous evolutionary biologist than I will ever be.
[00:03:23.000 --> 00:03:29.480] But at the same time, I also decided that I was getting a bit tired of writing academic papers.
[00:03:29.480 --> 00:03:34.280] You know, you write a paper and about five people read them, and that's about it.
[00:03:34.280 --> 00:03:39.640] And I thought it would be nice to try and write for a more general audience.
[00:03:39.640 --> 00:03:50.640] But the main reason why I wanted to write the book How Evolution Works is because I always looked at organisms as, you know, from sort of a Darwin perspective.
[00:03:50.880 --> 00:03:55.040] You look at the individual, you try and understand why they behave and the way they behave.
[00:03:55.200 --> 00:04:00.800] Always worked on insects, mainly honeybees and ants.
[00:04:01.120 --> 00:04:15.840] And I thought, well, you know, everyone's talking about the genomes that, of course, the Human Genome Project had come out, which was going to reveal everything that's special about humans, which, of course, it didn't.
[00:04:16.160 --> 00:04:21.120] And then I realized, well, I don't actually understand how you marry the two levels.
[00:04:21.120 --> 00:04:28.880] So you look, natural selection works at the level of the individual, but of course, the changes happen at the level of the genome.
[00:04:28.880 --> 00:04:38.640] And since I'm not a geneticist, and to be honest, I never had any interest in genetics or genomics whatsoever, I thought, well, this might be a good challenge.
[00:04:38.640 --> 00:04:43.680] So let's try and write a book that explains how evolution works.
[00:04:44.000 --> 00:04:46.800] But that's quite a big question.
[00:04:46.800 --> 00:04:50.000] I mean, if you look at Ansmeyer's book, it basically talks about everything.
[00:04:50.000 --> 00:04:53.520] And it's more an academic book, which I didn't want to write.
[00:04:53.520 --> 00:04:58.320] So I thought, well, what can I do to make it interesting to a general audience?
[00:04:58.320 --> 00:04:59.760] Well, what are people interested in?
[00:04:59.760 --> 00:05:02.400] They're interested in humans.
[00:05:02.720 --> 00:05:10.400] I had no interest in human evolution whatsoever, never thought about it, never read anything about it.
[00:05:10.400 --> 00:05:15.680] And then I realized, well, actually, it's amazingly interesting.
[00:05:15.680 --> 00:05:34.680] And then I bumped into all these different little changes, if you wish, because the other angle that I started to take was: if you look at chimpanzees and you look at humans, our genomes are, depending on how you measure it, 98% the same.
[00:05:35.000 --> 00:05:44.040] So, if genes are so important, how can we be genetically so identical and phenotypically and behaviorally, etc., so different?
[00:05:44.040 --> 00:05:47.240] So, that's basically where I started reading.
[00:05:47.240 --> 00:05:55.560] And then I found all these funny mistakes, if you wish, that made us who we are.
[00:05:55.880 --> 00:06:01.800] And then I stumbled on all the morphological changes that allowed us to speak.
[00:06:01.800 --> 00:06:11.960] And then I realized, well, natural selection works on increasing your reproductive output.
[00:06:12.280 --> 00:06:20.040] The way our morphology changed that allowed us to speak coincided with producing those babies, which are really premature.
[00:06:20.040 --> 00:06:35.480] So, putting one and one together, I thought, well, that must be the origin of language to raise babies that are so premature because we have such a large head, which at the same time caused a problem and also gave us the solution.
[00:06:35.480 --> 00:06:36.120] Nice.
[00:06:36.120 --> 00:06:36.680] Wow.
[00:06:36.680 --> 00:06:38.440] That's a perfect introduction.
[00:06:38.600 --> 00:06:48.280] Let me tease this up for you reading the opening paragraphs of my very first column in Scientific American, which I called Darwin's Dictum.
[00:06:48.280 --> 00:07:05.160] In 1861, less than two years after the publication of Charles Darwin's On the Origin of Species, in a session before the British Association for the Advancement of Science, a critic claimed that Darwin's book was too theoretical, and that he should have just put the facts before us and let them rest.
[00:07:05.160 --> 00:07:12.520] In a letter to his friend Henry Fawcett, who was in attendance in his defense, Darwin explained the proper relationship between facts and theory.
[00:07:12.520 --> 00:07:19.520] Here's Darwin: About 30 years ago, there was much talk that geologists ought only to observe and not theorize.
[00:07:19.520 --> 00:07:27.280] And I well remember someone saying that at this rate, a man might as well go into a gravel pit and count the pebbles and describe the colors.
[00:07:27.600 --> 00:07:37.200] How odd it is that anyone should not see that all observation must be for or against some view if it is to be of any service.
[00:07:37.520 --> 00:07:39.120] So I'm teeing this up for you.
[00:07:39.120 --> 00:07:43.280] What is your theory of language for or against?
[00:07:43.920 --> 00:07:48.160] Who has the other theories and what are they and how is yours different?
[00:07:48.800 --> 00:07:57.680] Well, that's interesting because as far as I know, no one came up with a theory for why, no, for how language came about.
[00:07:58.320 --> 00:08:07.840] So I mentioned the mating minds, the Jeff Miller's the mating minds, that the brain is basically a sexually selected organ.
[00:08:08.160 --> 00:08:16.160] It allows men in particular to be really clever and witty and to write poetry and novels, etc.
[00:08:16.480 --> 00:08:20.880] And that's all to do with seducing women, basically.
[00:08:21.520 --> 00:08:29.760] And of course, women also have to have a brain large enough to understand all the jokes because there's nothing worse than making a joke and no one's laughing.
[00:08:29.760 --> 00:08:31.600] So that's the mating mind.
[00:08:32.720 --> 00:08:36.080] Then, of course, Harari has written three books.
[00:08:36.080 --> 00:08:38.080] I haven't read the third one yet.
[00:08:38.400 --> 00:08:48.400] In which he explains human society, which is all based around fictions, fictions that we come up with that we all believe in.
[00:08:48.400 --> 00:08:54.720] And because we all believe in it, society becomes a sort of a cohesive entity.
[00:08:54.720 --> 00:09:04.760] And that, of course, assists humans because strong groups are stronger than the group next door, which is not so strong because they don't have these cohesive beliefs.
[00:09:04.840 --> 00:09:15.000] So, he gives examples like religion, things like money, and of course, we all know, you know, bitcoins are nothing real.
[00:09:15.000 --> 00:09:21.000] Actually, the stock market is not real, but we believe in all these things because we convince ourselves that that's what we believe in.
[00:09:21.240 --> 00:09:24.040] This stuff is worthless, it's just paper and ink.
[00:09:24.040 --> 00:09:24.840] Exactly.
[00:09:24.840 --> 00:09:35.560] And it's only worth something because you trust the system that when you go to the supermarket, you can actually buy something using just a piece of paper.
[00:09:35.560 --> 00:09:41.800] So, that's very convincing, and it explains nicely why societies are the way they are.
[00:09:41.800 --> 00:09:45.080] But he never explains where language came from in the first place.
[00:09:45.720 --> 00:09:49.400] Same with some of the other, sorry, I don't remember what it was.
[00:09:49.560 --> 00:09:54.200] Well, you had, yes, Terence Deacon's, the symbolic species.
[00:09:54.520 --> 00:10:02.280] I don't know if he offered a first origin, but at least the interaction once it gets rolling.
[00:10:02.280 --> 00:10:03.480] Right between brain structures.
[00:10:03.560 --> 00:10:04.440] Oh, the symbolic species.
[00:10:04.600 --> 00:10:13.560] Yeah, the brain structure and language each affect each other in an autocatalytic system, and they kind of develop together.
[00:10:13.880 --> 00:10:23.960] Yes, but he's the one who first came up with the idea of seeing language as a virus, a virus well adapted to the brains of babies, which I think is just brilliant.
[00:10:23.960 --> 00:10:28.600] Because I think, well, to me, that's extremely compelling.
[00:10:28.600 --> 00:10:37.240] Because, so I have this tiny granddaughter, she's actually not that tiny anymore because she's growing like crazy, and she's she's discovering her voice.
[00:10:37.240 --> 00:10:53.280] And of course, she's just mimicking what her parents and other people are doing, but she's a sponge, so she soaks up all the information around her, which of course comes from all the interactions with adults and others around her.
[00:10:53.920 --> 00:11:07.600] And the language is so, if you believe deacon, which as I said is very compelling, language is basically designed to do well in baby brains.
[00:11:07.600 --> 00:11:12.480] So that's why babies are so good at absorbing language.
[00:11:12.480 --> 00:11:27.680] You don't have to teach them, they just listen and they figure out from interactions: oh, okay, if someone is looking at that and they always use that word, then that word must mean that particular object that I always see when they mention it.
[00:11:27.680 --> 00:11:34.800] And it can be green or blue, or it can be big or small, but I'll give the example of the ball.
[00:11:35.120 --> 00:11:44.640] If it's round, you can pick it up, no matter what size it is, no matter what colour, no matter where it is, ball is linked to that particular object.
[00:11:44.640 --> 00:11:46.800] And that's how they learn.
[00:11:47.120 --> 00:11:52.080] And languages, of course, would go extinct if we wouldn't be speaking them.
[00:11:52.080 --> 00:11:55.040] So language is really akin to a virus.
[00:11:55.040 --> 00:12:02.000] If there's no brains to live in, and brains that transmit it to another brain, then languages would go extinct.
[00:12:02.000 --> 00:12:04.240] Because humans also depend on language.
[00:12:04.240 --> 00:12:15.120] I mean, we would still be alive if we wouldn't be able to speak, but we would certainly become a very different kind of species, I think.
[00:12:15.120 --> 00:12:20.480] So I really like his view of language as a virus.
[00:12:20.480 --> 00:12:38.200] And it completely fits with my idea that language is needed for babies, to raise babies, but of course, the babies also need the language because they have to be able to manipulate others around them to look after them for about 15 years because that's how long it takes them to become independent.
[00:12:38.200 --> 00:12:38.920] Yeah.
[00:12:39.160 --> 00:12:43.640] Since you mentioned the association between words and objects.
[00:12:43.880 --> 00:12:52.440] So go back a little bit in history to Skinner's theory of language acquisition and then what Chomsky did to challenge that.
[00:12:53.400 --> 00:13:10.760] Well Skinner saw language as sort of an association game, which in a way it is, because you, as I said, the example that I gave of the ball, if you always hear the word ball and it's a particular object that looks around and you can pick it up, then you start making that association.
[00:13:10.760 --> 00:13:14.680] But the association is, of course, not the same as syntax.
[00:13:14.680 --> 00:13:24.360] So Chomsky came up with this idea, which is based on the fact that children just effortless learn languages, no matter how many they're exposed to when they're really young.
[00:13:24.360 --> 00:13:30.520] They just pick them up, which I find is really mean because I'm learning Spanish at the moment and it's really hard.
[00:13:31.560 --> 00:13:42.440] So he came up with this innate language acquisition device, which is basically something I always see the skin as sort of a miracle.
[00:13:42.440 --> 00:13:51.160] So all of a sudden humans had this language acquisition device, which means that babies understand syntax.
[00:13:51.160 --> 00:13:57.560] So from that the idea became okay that there must be a language gene.
[00:13:57.560 --> 00:14:14.760] So there's a gene that makes us different from all our ancestors, from chimpanzees, and that gene allowed our babies to understand language with no particular learning skills or whatever.
[00:14:15.440 --> 00:14:25.040] Now, of course, people started looking for the language gene, and at some stage, people thought they found a language gene in the form of FOXP2.
[00:14:26.320 --> 00:14:34.960] Which is interesting because FOXP2 is a gene that is important for vocalization, but not just in humans.
[00:14:34.960 --> 00:14:41.360] In almost all vertebrates, FOXP2 is present and it's important in focalization.
[00:14:41.360 --> 00:14:48.240] If you knock it out in mice, the baby mice don't produce a sound anymore that attracts the mother.
[00:14:50.720 --> 00:15:04.320] And then another study found or claimed that there were signs or signature of positive selection on fox P2 in our lineage.
[00:15:04.320 --> 00:15:13.200] So they took that to mean that clearly there's been selection on FOXP2 in humans, in Homo sapiens only.
[00:15:13.200 --> 00:15:20.160] So that means it really is very strong evidence that that's a gene for language.
[00:15:20.160 --> 00:15:28.640] And the later study first found that the same FOXP2 gene can be found in Yanditals and Dennis Ophens.
[00:15:29.280 --> 00:15:36.720] And then a bigger study that took much, many more samples totally debunked the positive selection.
[00:15:36.720 --> 00:15:41.120] So they found no evidence of positive selection on FOXP2.
[00:15:41.120 --> 00:15:46.320] And I think that was the nail in the coffin of FOXP2 being the language gene.
[00:15:46.320 --> 00:15:49.520] So language gene just doesn't exist.
[00:15:49.520 --> 00:16:00.040] Oh, I forgot to mention that it all started with this particular family in the UK that had a very weird inability to speak, basically.
[00:15:59.760 --> 00:16:06.120] So they seemed to be of average intelligence, but they could not speak properly.
[00:16:06.760 --> 00:16:08.600] And they all had a mutation.
[00:16:08.600 --> 00:16:11.720] And that mutation was in a particular part of a chromosome.
[00:16:11.720 --> 00:16:12.840] I can't remember which one.
[00:16:12.840 --> 00:16:19.160] And then later, in that particular part of their chromosome, they found the FOXP2 gene.
[00:16:19.160 --> 00:16:20.920] So that's how it all started.
[00:16:20.920 --> 00:16:23.080] And of course, it was a brilliant story.
[00:16:23.080 --> 00:16:24.760] It just didn't hold up.
[00:16:24.760 --> 00:16:25.880] Well, there's no language.
[00:16:26.600 --> 00:16:34.920] Would it be correct to say that the FOXP2 gene is necessary but not sufficient to explain language acquisition?
[00:16:36.120 --> 00:16:39.960] It's necessary for vocalization, yes.
[00:16:39.960 --> 00:16:40.760] Oh, for vocalization.
[00:16:41.000 --> 00:16:42.360] But not for comprehension.
[00:16:42.360 --> 00:16:49.320] In other words, the members of this family could comprehend what somebody was saying, but they just couldn't answer.
[00:16:49.960 --> 00:16:51.240] I think so, yes.
[00:16:51.240 --> 00:16:58.680] But the other thing that they couldn't do, so apparently it would keep, and I found it in Pinker's book.
[00:16:59.160 --> 00:17:08.520] But little children, if you make up a verb, then they can infer what the past tense of that verb is because they just know the rules.
[00:17:08.840 --> 00:17:11.240] And these people couldn't do that either.
[00:17:11.240 --> 00:17:12.040] Oh, like.
[00:17:12.120 --> 00:17:13.400] So the comprehension.
[00:17:13.400 --> 00:17:13.720] Yeah.
[00:17:13.720 --> 00:17:14.680] Like add an ED.
[00:17:14.840 --> 00:17:15.320] They didn't have a.
[00:17:15.960 --> 00:17:17.160] They couldn't get that interesting.
[00:17:17.240 --> 00:17:23.000] I wonder if they could read, comprehend through reading, if not hearing.
[00:17:23.000 --> 00:17:26.840] Or if they could type out a response but not say it.
[00:17:28.120 --> 00:17:31.480] I'm not sure because writing is much more difficult than speaking, though.
[00:17:31.480 --> 00:17:34.840] Yeah, but writing is only about 5,000 years old.
[00:17:34.840 --> 00:17:35.720] Yes, that's right.
[00:17:35.720 --> 00:17:42.840] But I was just thinking of stroke patients who cannot speak, but they can type out what they're trying to say.
[00:17:42.840 --> 00:17:46.240] But that would be somebody who already has language acquisition.
[00:17:46.240 --> 00:17:46.720] Yeah.
[00:17:44.840 --> 00:17:50.400] But also to mention where in the brain the damage is.
[00:17:44.840 --> 00:17:51.760] I mean, that's the other interesting thing.
[00:17:52.000 --> 00:17:58.240] So, you know, people have also been looking for the part of your brain that's responsible for language.
[00:17:58.560 --> 00:18:10.880] But different kinds of damage have in your brain have different effects on your linguistic abilities, understanding, comprehension, speaking.
[00:18:11.520 --> 00:18:13.840] Oh, is that Broca's area?
[00:18:14.720 --> 00:18:15.040] I forget.
[00:18:15.120 --> 00:18:15.600] Well, that's one of the things.
[00:18:15.760 --> 00:18:18.640] I forget what that was for just for comprehension, maybe.
[00:18:18.960 --> 00:18:19.440] Yeah.
[00:18:19.760 --> 00:18:28.960] But now, this is a more recent study which looked at a geometry of the brain, which claims that there aren't.
[00:18:28.960 --> 00:18:40.640] Well, it's not saying that there aren't specific areas in the brain that do something for something specific, but that the connections within the brain are much more fluid.
[00:18:40.960 --> 00:18:49.440] It's more like waves, information waves going through the brain instead of particular parts of the brain being very essential for particular behaviors.
[00:18:49.440 --> 00:18:50.000] Yeah.
[00:18:50.000 --> 00:18:53.680] So your point is having the Fox P2 gene isn't enough.
[00:18:53.680 --> 00:19:01.760] So Neanderthals and Denisovans had them, had that, but we don't, that doesn't automatically mean they could speak.
[00:19:01.760 --> 00:19:05.920] Maybe they could, maybe they had language, but we just don't know that for sure, right?
[00:19:06.880 --> 00:19:11.440] No, the trouble is we never can, but I strongly suspect that they couldn't.
[00:19:11.440 --> 00:19:13.280] At least not the way we speak.
[00:19:13.280 --> 00:19:16.240] For the simple reason that they're, well, two reasons.
[00:19:16.240 --> 00:19:19.680] Their anatomy wasn't right, so their necks were too short.
[00:19:19.680 --> 00:19:24.960] Of course, I have to caveat that because we don't know anything about the Denisolvans.
[00:19:24.960 --> 00:19:26.160] You say Denisophants?
[00:19:26.200 --> 00:19:29.880] I think that's I think you're saying it the correct way.
[00:19:30.200 --> 00:19:30.920] Oh, good.
[00:19:30.920 --> 00:19:31.960] I'm glad.
[00:19:29.440 --> 00:19:36.520] So we don't know anything about them because we have a pinky and I think there's a molar.
[00:19:36.680 --> 00:19:40.840] So there's a lot of DNA, but no idea what they look like.
[00:19:40.840 --> 00:19:45.160] But Neanderthals seem to have had a neck that's too short.
[00:19:45.160 --> 00:19:57.880] Too short in the sense that the larynx would sit too low in the neck to be able to make the sounds that we can make.
[00:19:58.200 --> 00:20:42.520] Then there's this other more recent study that looks at epigenetic marks and they found that so if you look at the differences in the epigenome, so that's not the sequence of DNA, but the methylation pattern of the DNA, if you compare that between they used chimpanzees, Neanderthals, Denisulphans, and Homo sapiens, and then they found that the biggest difference, I think it was 53 differently methylated genes in humans that all have something to do with the larynx and the vocal cavity, also with the pelvis, but that's not interested for not interesting for language.
[00:20:42.760 --> 00:20:58.360] So those two pieces of information I think will make me believe, or believe is a bad word, but make me think that we are the only ones that have this precise form of communication.
[00:20:58.360 --> 00:21:18.720] That's not to say that Neanderthals and Denisulphans didn't have any communication, because I think all humans, you know, and now with humans, I mean all our ancestors when we split from the common ancestor with chimpanzees or bonobos, they all must have had some form of communication because all social animals have a form of communication.
[00:21:19.040 --> 00:21:28.720] I'm just arguing that we are the only ones with a sophisticated form of communication, which is due to many flukes of nature and the need to look after our babies.
[00:21:29.040 --> 00:21:29.840] Yeah.
[00:21:29.840 --> 00:21:42.160] So, yeah, maybe make a distinction between language and communication since you wrote about ants solving the traveling salesman problem of how to, you know, cover a certain amount of territory the most efficient way.
[00:21:42.160 --> 00:21:48.000] Of course, they don't have symbolic language or whatever, so communication can be done, but it lacks what?
[00:21:48.320 --> 00:21:54.320] Symbolic, you know, symbolism as a form of communication?
[00:21:55.280 --> 00:21:58.720] Well, I think you need symbolism before you can have a symbolic language.
[00:21:58.720 --> 00:22:00.480] And of course, our language is symbolic.
[00:22:00.480 --> 00:22:07.760] I mean, the other symbolic language which is quite different is the bees dance language.
[00:22:07.760 --> 00:22:14.320] So they code distance and profitability in the way they dance.
[00:22:15.440 --> 00:22:20.320] If the source that they're dancing for is a long way away, they dance for longer.
[00:22:20.320 --> 00:22:35.760] And the direction in which you have to fly to get there is determined by, or is symbolized in the angle relative to the sun, and how great the food source is that they've found is reflected in how enthusiastic they dance.
[00:22:35.760 --> 00:22:41.520] So that's a symbolic language, but it only has three bits of information.
[00:22:42.720 --> 00:22:47.200] Our language is special.
[00:22:47.200 --> 00:22:51.200] And of course, you know, cetaceans, they have forms of communication.
[00:22:51.200 --> 00:22:59.040] I was actually reading or hearing falls here, and for me, that means comfort meals, cozy nights, and tail-gating weekends.
[00:22:59.040 --> 00:23:01.800] And Omaha Steaks makes it all easy.
[00:22:59.840 --> 00:23:05.960] I love having their premium steaks and juicy burgers ready in my freezer.
[00:23:06.280 --> 00:23:13.000] I recently grilled their filet mignon, so tender, flavorful, and better than anything I've had elsewhere.
[00:23:13.000 --> 00:23:21.560] Right now, during their red-hot sale, you get 50% off site-wide, plus get an extra $35 off with code flavor at checkout.
[00:23:21.560 --> 00:23:24.680] Get fired up for fall grilling with Omaha Steaks.
[00:23:24.680 --> 00:23:30.520] Visit Omahasteaks.com for 50% off site-wide during their red-hot sale event.
[00:23:30.520 --> 00:23:35.080] And for an extra $35 off, use promo code flavor at checkout.
[00:23:35.080 --> 00:23:42.280] That's 50% off at Omahasteaks.com and an extra $35 off with promo code flavor at checkout.
[00:23:42.280 --> 00:23:43.800] See site for details.
[00:23:43.800 --> 00:23:52.760] Hearing, I can't quite remember that dolphins give themselves a particular name, so it's a particular sound that they only use to refer to themselves.
[00:23:52.760 --> 00:23:55.800] That, of course, is also symbolic.
[00:23:56.120 --> 00:23:59.000] But our language is infinite.
[00:23:59.000 --> 00:24:05.000] So we can use words in different syntax basically to mean different things.
[00:24:05.000 --> 00:24:14.440] So we can basically say whatever we want, and we can talk about the past, we can talk about what's happening now, we can talk about the future, we can talk about things that don't exist.
[00:24:14.440 --> 00:24:18.040] I mean, that's all Harari's argument.
[00:24:18.280 --> 00:24:22.680] You know, we talk about things that don't exist and pretend that they do exist.
[00:24:22.680 --> 00:24:31.320] So that's what makes our language, our form of communication, special or unique and special from everything else.
[00:24:31.560 --> 00:24:33.720] But bacteria, they communicate.
[00:24:33.720 --> 00:24:34.520] Yes, right.
[00:24:34.520 --> 00:24:34.920] Yeah.
[00:24:35.240 --> 00:24:42.520] Yeah, it does feel like there's a big gap there that sounds where explanations sound like, and then a miracle happened.
[00:24:42.760 --> 00:24:44.200] You know, and then we have this thing.
[00:24:44.200 --> 00:24:51.520] So, I mean, you're point that we still have to find a gradual evolutionary sequence of how this came about and why.
[00:24:51.840 --> 00:24:56.960] So, maybe speak for just a moment back to general how natural selection works.
[00:24:57.120 --> 00:25:01.360] Why do you think it's natural selection and not sexual selection, or do you think it's both?
[00:25:01.360 --> 00:25:06.000] And is language an adaptation or more of an exaptation spandrel?
[00:25:06.320 --> 00:25:10.640] Why would you think it could be either one of those?
[00:25:10.640 --> 00:25:17.440] Maybe just kind of give that general how evolution works with the context of language development.
[00:25:17.760 --> 00:25:38.800] Well, of course, in general, how evolution works, if you have, if you as an individual have some sort of trait that gives you a leg up in competition with others in your population, leg up in a sense that you can produce more offspring or you can produce them more efficiently, then that will be selected for if it is heritable.
[00:25:38.800 --> 00:26:01.600] So, my argument is that because we had all these genetic and morphological flukes in our evolutionary history that I describe in the book, at some stage we had this really strange fluke in the sense that we had a pseudogene that's still a pseudogene in gorillas and chimpanzees, got repaired, duplicated.
[00:26:01.600 --> 00:26:11.920] We have all these inversions and chromosomes, which are weird genetic aberrations, and that basically gave us more neurons.
[00:26:11.920 --> 00:26:14.960] So, our head started to balloon.
[00:26:15.280 --> 00:26:22.560] If your head starts to balloon, you have to come up with a solution because, you know, your skull needs to adapt.
[00:26:22.560 --> 00:26:30.280] The skull is a very malleable structure because it's comprised of all these different components, it sort of can move.
[00:26:29.440 --> 00:26:34.200] So the skull could adapt to this expanding brain.
[00:26:34.520 --> 00:26:47.000] So just reiterate again, the brain expanded because of genetic fluke, a duplication or triplication of a particular gene that prevented the stem cells of neurons to keep dividing for longer.
[00:26:47.000 --> 00:26:53.080] So you get more neurons, the neurons migrate to the cortex, the cortex expands.
[00:26:53.400 --> 00:26:59.480] The skull adapted to this increase in brain mass to basically make it fit.
[00:26:59.480 --> 00:27:03.880] And that then led to a lengthening of the throat, a descent of the larynx.
[00:27:04.040 --> 00:27:18.360] All of a sudden, in a species that's already highly social, we started to be able to make more precise sounds just because of these changes in our morphology.
[00:27:18.680 --> 00:27:22.040] Now, so these are all flukes in a way.
[00:27:22.040 --> 00:27:25.160] But precise communication was an adaptation.
[00:27:25.160 --> 00:27:27.160] So I don't think it was an acceptation.
[00:27:27.160 --> 00:27:36.280] I think it was an adaptation because this now made it possible to be a bit more precise in the way you communicate.
[00:27:36.920 --> 00:27:43.400] If you were more precise in the way that you communicate, you can elicit more help.
[00:27:43.720 --> 00:28:04.680] Little babies that were now born even more prematurely because of this big brain, they could no longer stay in the mother because otherwise either they wouldn't be able to fit through the pelvis, through the birth canal, but also because their brain was so large, it's metabolically extremely expensive.
[00:28:04.680 --> 00:28:10.360] So, for the mother to increase the length of the pregnancy, it's just metabolically not possible.
[00:28:10.360 --> 00:28:14.280] So, two reasons why those babies needed to be born very prematurely.
[00:28:14.280 --> 00:28:36.560] You now have a very premature brain in a little baby that's outside of the womb, surrounded by other individuals, because our ancestors were already social because they decided to become bipedal in an environment which is extremely dangerous because of all the predators that can run much faster.
[00:28:36.560 --> 00:28:50.320] So, my argument is from Lucy onwards, so that's Astrolopithecus afarensis, 4.4 million years ago or so, our ancestors already started to live in family groups.
[00:28:50.320 --> 00:28:53.520] And these family bonds just became stronger and stronger.
[00:28:53.520 --> 00:29:01.120] So, we already have this very strong social kin group structure, which requires communication.
[00:29:01.120 --> 00:29:12.400] As I said earlier, all social animals need some form of communication, but now through all these flukes of nature, we had a means to be a little bit more precise in our communication.
[00:29:12.400 --> 00:29:31.360] The babies, because they're so neuroplastic, pick up all those social cues, and also for them to be able to elicit more help by some form of more precise communication gives them a leg up because now they get more resources relative to other babies.
[00:29:31.360 --> 00:29:50.640] So, you get this positive selection, I think, also as high or positive feedback, I should say, between just being a little bit better at communication, making it a little bit easier to raise those dependent babies, which then selects for getting better still.
[00:29:50.640 --> 00:30:03.560] And you basically have runaway selection between the need to look after those extremely needy babies and a form of communication that can be extremely precise.
[00:30:04.200 --> 00:30:25.240] Once, of course, you have something like language, then you can get all the other things, then you can have sexual selection come and play a role, the mating mind like Miller, or all the fictions that Harare talks about that makes human societies even stronger because of these shared beliefs.
[00:30:25.560 --> 00:30:34.680] But I still think if you drill down to the essence of Darwinian natural selection, it has to have something to do with giving you a reproductive advantage.
[00:30:34.680 --> 00:30:40.040] So that's why I think language is an adaptation and not an exaptation.
[00:30:40.040 --> 00:30:44.520] And it has everything to do with increasing reproductive output.
[00:30:45.800 --> 00:30:46.600] Perfectly said.
[00:30:46.600 --> 00:30:47.960] Wow, incredible.
[00:30:47.960 --> 00:30:49.560] You're so articulate on this.
[00:30:49.560 --> 00:30:51.000] Of course, you're the author of the book.
[00:30:51.000 --> 00:30:52.680] You wrote about Alfred Russell Wallace.
[00:30:52.840 --> 00:30:55.000] You know, I wrote a biography of him.
[00:30:55.000 --> 00:30:58.040] That was my doctoral dissertation in the history of science.
[00:30:58.440 --> 00:31:12.920] He was kind of a hyper-adaptationist, such that if he could not think of what the adaptive purpose would be of having a brain, you know, four times the size that we really need it, chimps do just fine with a much smaller brain.
[00:31:12.920 --> 00:31:14.280] What do we need that for?
[00:31:14.280 --> 00:31:23.240] Why do you need to be able to abstractly reason to be able to do calculus, for example, or aesthetic appreciation like music and so on?
[00:31:23.240 --> 00:31:33.880] Not being able to think of how that could be adaptive to lead more offspring and so on, then he invoked, well, there must be some higher power that kind of plopped that in there.
[00:31:34.120 --> 00:31:44.520] So that's one of the dangers of hyper-adaptationism, which is why Steve Gould always kind of pushed back against that, because some people pushed, like Wallace pushed it too far.
[00:31:44.960 --> 00:31:54.160] But in terms of a Darwinian explanation, then let's talk about the great apes and their language as something of a bridge, maybe.
[00:31:54.160 --> 00:31:57.440] I think what you're arguing goes beyond what they're able to do.
[00:31:57.440 --> 00:32:04.320] And they did not have those quirky little contingent events that happened in their skulls and so forth.
[00:32:04.320 --> 00:32:07.680] And yet they still have some symbolic communication, right?
[00:32:07.680 --> 00:32:15.440] So you wrote about Kanzi, the bonobo, and gorillas like Coco and Nimchimsky.
[00:32:16.240 --> 00:32:17.440] I saw that documentary.
[00:32:17.440 --> 00:32:21.040] That was a pretty depressing, sad, tragic story.
[00:32:21.040 --> 00:32:22.880] It was very sad, actually.
[00:32:23.440 --> 00:32:28.480] What was the name of the famous psychologist that ran that experiment?
[00:32:28.480 --> 00:32:29.840] I'm trying to remember his name.
[00:32:29.840 --> 00:32:31.040] It was infuriating.
[00:32:31.760 --> 00:32:33.200] Sharon's something rather, I think.
[00:32:33.520 --> 00:32:40.480] Yeah, but that was the 1960s when there was less concern about animal cruelty and rights and so forth.
[00:32:41.520 --> 00:32:44.400] But, you know, we've kind of at Skeptic, we've sort of followed the debate.
[00:32:44.400 --> 00:32:47.840] You know, is this a form of language?
[00:32:47.840 --> 00:33:12.320] You know, where you see these videos, like Sue Savage Ramba's videos with Kanzi, you know, they're impressive, but then there are critics that say, well, but there could be more of a clever Hans effect where the chimp or the gorilla is reading the body language of the controller or the, you know, their caretaker, and they're kind of doing a more of a Skinnerian association.
[00:33:12.320 --> 00:33:14.400] In other words, the lights aren't on upstairs.
[00:33:14.400 --> 00:33:20.480] They're just kind of processing it in some more Skinnerian stimulus response way.
[00:33:20.800 --> 00:33:33.880] But you made the case in your book that Kanzi probably does have more comprehension of language and words than just what a Skinnerian association model would provide.
[00:33:35.160 --> 00:33:39.720] Well, body language is the start to any form of communication, I think.
[00:33:40.680 --> 00:33:44.600] So I wouldn't say that that is something completely different.
[00:33:44.600 --> 00:33:46.920] I mean, we all use body language.
[00:33:46.920 --> 00:33:54.360] Even now, you know, we have the sophisticated form of communication, but we still really rely on body language.
[00:33:54.520 --> 00:34:09.800] So I think that's one of the first, well, essential, basically, you need to be able to read someone else's body language before you can have any other form of communication.
[00:34:09.800 --> 00:34:16.280] Now, someone did a, I really love those videos, by the way, of Serbin and Kansy.
[00:34:18.920 --> 00:34:22.040] So someone did an analysis.
[00:34:22.040 --> 00:34:29.240] So he went through all the videos that are available online from all the experiments that she did, if you call them experiments.
[00:34:29.240 --> 00:34:33.320] It's more, you know, living with Kenzie in a way.
[00:34:33.960 --> 00:34:44.760] And Stephen Pinker once made, used the example in his book, Dog Bites Man, will not get into the newspaper, but man bites dog probably will.
[00:34:44.760 --> 00:34:49.720] So you have the same words, different syntax, completely different meaning.
[00:34:49.720 --> 00:35:04.280] So he looked at all the videos of Cansey and all the phrases and sentences that the researchers used and came to the conclusion.
[00:35:04.280 --> 00:35:09.560] It's been a while since I read that paper, so forgive me, I don't know, I can't remember all the details.
[00:35:09.560 --> 00:35:23.040] But he came to the conclusion that it's Kenzie does comprehend language because he understands that words in a different order mean something different.
[00:35:23.040 --> 00:35:35.920] So then, if that is true, then you have to appreciate that he does understand language as such and not just use the Skinnerian associations.
[00:35:36.880 --> 00:35:48.400] But still, quite a quantitative difference between what Kanzi and Coco could do in NIM versus what humans could do.
[00:35:48.400 --> 00:35:52.160] Is it quantitatively different enough to be qualitatively different?
[00:35:52.160 --> 00:35:56.320] In other words, Kanzi can't tell me that his father was poor but honest, right?
[00:35:56.640 --> 00:35:58.240] Or something like that.
[00:35:58.240 --> 00:35:58.800] No.
[00:36:00.080 --> 00:36:00.880] No, definitely.
[00:36:00.880 --> 00:36:03.200] I mean, there's a difference in brain size too.
[00:36:03.200 --> 00:36:03.760] Yeah.
[00:36:03.760 --> 00:36:05.280] And there's lots of differences.
[00:36:05.280 --> 00:36:13.440] So I'm not saying that Kenzie would be able to speak like you and I are speaking now if he just had the different morphology.
[00:36:13.760 --> 00:36:15.520] But the two things go together again.
[00:36:15.520 --> 00:36:23.600] I mean, Darwin himself made that remark that other primates probably could.
[00:36:23.920 --> 00:36:37.920] So their focal cords, I think he made the argument that there's no morphological reason why they wouldn't be able to speak, but it's the lack of brain size that prevented any further increase in their ability to communicate.
[00:36:38.240 --> 00:36:40.800] And I think he was dead right.
[00:36:41.120 --> 00:36:52.320] So they don't have the morphology to be able to make the precise signs that we are making, which then, of course, means that there was no positive feedback again on the brain either.
[00:36:52.320 --> 00:36:56.640] But also, they didn't have this weird fluke that made their brain balloon.
[00:36:56.640 --> 00:37:01.080] So, that's why the story is not a linear story.
[00:36:59.440 --> 00:37:09.400] So, people like thinking in linearities, but it's not, there's all these things that go together that happened sort of at the same time.
[00:37:09.400 --> 00:37:17.480] And, in my view, every time one of those weird flukes happened in our evolutionary history, natural selection just had to make do.
[00:37:17.480 --> 00:37:28.440] Either I'm going to let this weird species go extinct because now it's got this strange fluke and I don't know what to do with it, or okay, we'll find a solution.
[00:37:28.440 --> 00:37:36.120] And natural selection, every time in our evolutionary history, find a solution that just, okay, we can cope with this weird fluke now.
[00:37:36.120 --> 00:37:38.280] Okay, they're now walking on two legs.
[00:37:38.280 --> 00:37:49.080] Well, then they basically have to stick together because otherwise they will be predated on because the poor buggers can't even run fast enough because they started walking on two legs.
[00:37:49.080 --> 00:37:57.640] The pelvis had to change even more to make it to increase the balance.
[00:37:57.640 --> 00:38:02.680] That then led them, Homo erectus, to be able to run long distances.
[00:38:02.680 --> 00:38:06.040] If you run long distances, you have to be able to sweat, etc.
[00:38:06.120 --> 00:38:20.440] So, every time there was some weird event in our evolutionary history, natural selection allowed our species, our ancestors, to cope with it until the next one came.
[00:38:20.440 --> 00:38:22.680] So, that's how I sort of see it.
[00:38:22.680 --> 00:38:38.520] There's all these weird things, weird things just happen, and then either it's selected against, or it doesn't necessarily have to be selected for, it just has to be good enough to be able to stay alive and reproduce.
[00:38:38.920 --> 00:38:46.400] People tend to think that natural selection drives species towards something, but of course it doesn't.
[00:38:46.400 --> 00:38:48.800] Either you stay alive or you don't.
[00:38:44.840 --> 00:38:50.560] Those are the two options.
[00:38:50.880 --> 00:38:53.360] Yeah, there's no directionality in evolution.
[00:38:53.360 --> 00:38:58.240] It seems like it in hindsight, and we put ourselves at the pinnacle.
[00:38:58.240 --> 00:39:02.400] But, you know, if big brains were so great, how come almost no species have them?
[00:39:02.400 --> 00:39:03.360] They do just fine.
[00:39:03.360 --> 00:39:04.560] This is Wallace's point.
[00:39:04.560 --> 00:39:07.920] Most species do just fine with tiny little brains.
[00:39:08.560 --> 00:39:13.200] Well, most species can't have a large brain because they're so expensive.
[00:39:13.200 --> 00:39:13.680] Yeah.
[00:39:14.000 --> 00:39:36.160] So one reason, again, a very compelling argument for why the megafauna went extinct as soon as humans made an appearance is because these species were so huge, it takes a very long time to mature, which is fine if you don't have the risk of being killed off before you start reproducing.
[00:39:36.160 --> 00:39:41.360] So in the absence of predators, species can grow to enormous size.
[00:39:41.680 --> 00:39:44.560] Humans came along, they started picking them off.
[00:39:45.200 --> 00:39:50.080] It's not even necessary to assume that humans deliberately targeted the megafauna.
[00:39:50.080 --> 00:39:56.720] It's just megafauna, you know, you can't ignore a woolly mammoth, for example, because it's so big.
[00:39:56.720 --> 00:40:09.280] So they started killing them, which then selected against these very large animals because many of them would be killed before they're old enough, mature enough to reproduce.
[00:40:09.280 --> 00:40:11.360] The same is true of the brain.
[00:40:11.360 --> 00:40:17.360] So brain size is limited by body size.
[00:40:17.680 --> 00:40:25.200] If you want to grow a very large brain, sorry, not body size, it just takes a very long time to grow a large brain.
[00:40:25.200 --> 00:40:44.680] So, if you want, if you want, in the evolutionary sense, if you want to grow a large brain, you have to mature very slowly just because it takes how long to grow that brain, which makes you vulnerable, just like a megafauna, to die before you're mature enough to produce offspring.
[00:40:44.680 --> 00:40:47.160] So, it's a very tricky strategy.
[00:40:47.160 --> 00:41:06.600] So, only so I use this example of a meta-analysis that's been done on birds, where they basically try to figure out what limits brain size in birds, because there are many very clever birds and they have a large brain.
[00:41:06.600 --> 00:41:24.280] Every single one of them has very extended parental care, which means that the parents, while the baby bird or the young bird is growing its brain, the brain isn't sophisticated enough to be able for the bird to look after itself.
[00:41:24.280 --> 00:41:32.120] So, it can only get away with having this slow maturing large brain if it has other individuals looking after it.
[00:41:32.120 --> 00:41:34.200] And the same is true in our case.
[00:41:34.200 --> 00:41:44.040] Our babies could not get away with being born so totally helpless if there weren't all these other individuals around them to help raise them.
[00:41:44.200 --> 00:41:45.560] So, there's a trade-off.
[00:41:45.560 --> 00:41:54.440] So, most species just can't have a large enough brain because they don't have all the other conditions around it.
[00:41:54.440 --> 00:41:57.400] And that's called the grey ceiling hypothesis.
[00:41:57.400 --> 00:42:09.160] So, there's a particular brain size beyond which it's very difficult to, well, you basically can't get through this grey ceiling unless you're a hypersocial species.
[00:42:09.160 --> 00:42:11.800] And that's also where grandmothers come in.
[00:42:11.800 --> 00:42:20.720] So, the menopause or the grandmother hypothesis: why do human females stop reproducing or stop, yeah, well, yes, stop reproducing.
[00:42:14.680 --> 00:42:21.280] Two reasons.
[00:42:21.520 --> 00:42:30.000] One's if one, if it takes so long for your offspring to mature, if you reproduce at a every business has an ambition.
[00:42:30.000 --> 00:42:39.440] PayPal Open is the platform designed to help you grow into yours with business loans so you can expand and access to hundreds of millions of PayPal customers worldwide.
[00:42:39.440 --> 00:42:47.840] And your customers can pay all the ways they want with PayPal, Venmo, Pay Later, and all major cards so you can focus on scaling up.
[00:42:47.840 --> 00:42:52.720] When it's time to get growing, there's one platform for all business: PayPal Open.
[00:42:52.720 --> 00:42:55.280] Grow today at paypalopen.com.
[00:42:55.280 --> 00:42:57.920] Loan subject to approval in available locations.
[00:42:58.080 --> 00:43:03.040] Old age, then you're very likely that you die before your child is independent.
[00:43:03.040 --> 00:43:11.920] And then it's much better evolutionary scene to help raise your children, raise their children.
[00:43:11.920 --> 00:43:14.800] In other words, raise your grandchildren.
[00:43:15.120 --> 00:43:19.840] So grandmothers in particular, people don't tend to talk about grandfathers so much.
[00:43:19.840 --> 00:43:20.240] Yeah, hey.
[00:43:21.600 --> 00:43:22.960] I know, I know, I know.
[00:43:25.040 --> 00:43:26.320] Well, there are two reasons.
[00:43:26.320 --> 00:43:36.800] One is that grandfather or fathers in particular, in general, I should say, are never 100% sure that the children they're raising are theirs genetically.
[00:43:37.360 --> 00:43:45.920] So there's paternity uncertainty, which leads to a lot of very interesting behaviors in many animals, including humans, actually.
[00:43:46.480 --> 00:43:47.520] So that's one.
[00:43:47.520 --> 00:43:51.040] But also, for males, there is no incentive.
[00:43:51.040 --> 00:43:56.400] There's no reason why they should not stop reproducing because they can always find a younger female.
[00:43:56.400 --> 00:44:07.080] So those two reasons sort of not select against, but they don't make grandfathering as profitable in a kin selection sense as grandmothering is.
[00:44:07.400 --> 00:44:07.720] Yeah.
[00:44:08.520 --> 00:44:09.880] I have nothing against grandfathering.
[00:44:10.040 --> 00:44:12.840] Mama's baby, daddy's maybe, right?
[00:44:13.160 --> 00:44:14.760] Exactly, exactly.
[00:44:15.880 --> 00:44:20.520] Yeah, well, that explanation also goes a long way to explaining why we die.
[00:44:20.520 --> 00:44:22.680] You know, why can't we just live forever?
[00:44:23.240 --> 00:44:33.080] And the answer seems to be: well, once you've become a parent and then a grandparent, natural selection has pretty much done its job of getting your genes into the next generation.
[00:44:33.080 --> 00:44:44.040] And there's just no point in putting all that extra energy into keeping your body alive for 200 or 300 years when your genes are already into the future.
[00:44:45.320 --> 00:44:46.120] That's correct.
[00:44:46.120 --> 00:44:46.360] Yeah.
[00:44:46.360 --> 00:44:49.240] So that's the soma, what is it?
[00:44:49.240 --> 00:44:52.360] The soma death theory or whatever it's called.
[00:44:52.360 --> 00:44:52.760] Yeah.
[00:44:53.480 --> 00:44:53.800] Yes.
[00:44:53.800 --> 00:44:55.480] The disposable soma theory.
[00:44:55.480 --> 00:44:55.960] That's right.
[00:44:55.960 --> 00:44:56.360] That's right.
[00:44:56.360 --> 00:44:56.840] Yeah, that's right.
[00:44:57.320 --> 00:45:00.680] So keep you alive until you've reproduced and then you're done.
[00:45:00.680 --> 00:45:10.040] But of course, that changed in our lineage again, especially for females, because they live way beyond their reproductive age.
[00:45:11.560 --> 00:45:15.320] Of course, it doesn't mean that we can now live up to 200 years.
[00:45:15.320 --> 00:45:23.880] I know some people are convinced that they can, and they do all these weird experiments on themselves to see if they can get skeptic.
[00:45:23.960 --> 00:45:28.520] It's not going to happen unless there's some miracle breakthrough, technologically speaking.
[00:45:28.520 --> 00:45:34.440] Really, what's happening is modern medicine is getting more and more people up to the upper ceiling.
[00:45:34.600 --> 00:45:38.440] But no one's going past, say, 115 or 120.
[00:45:38.840 --> 00:45:41.080] Again, unless they're 122, apparently.
[00:45:41.080 --> 00:45:42.200] I was reading that the other night.
[00:45:42.200 --> 00:45:43.080] Yeah, maybe.
[00:45:43.040 --> 00:45:49.840] You know, some of these really old people, the reliability of the report, is there a birth certificate?
[00:45:44.840 --> 00:45:50.960] You know, we need to see that.
[00:45:51.280 --> 00:45:51.840] Right?
[00:45:51.840 --> 00:45:52.640] No, that's fair enough.
[00:45:52.640 --> 00:45:56.800] You know, about the blue zones where people supposedly live longer around the world.
[00:45:57.520 --> 00:46:04.400] But all that data now is called into question because it was all self-report data of how old they actually were.
[00:46:05.040 --> 00:46:06.400] Yeah, that's fair enough.
[00:46:06.400 --> 00:46:14.080] I did read, though, so the 122-year-old woman, she quit smoking at 117.
[00:46:14.080 --> 00:46:15.120] That's right.
[00:46:16.160 --> 00:46:20.640] Yeah, so confounding variables are a complicating factor for scientists.
[00:46:20.640 --> 00:46:20.960] Yeah.
[00:46:20.960 --> 00:46:22.480] How could that be?
[00:46:22.480 --> 00:46:26.800] And the guy that led the perfectly healthy life dropped dead at 35, right?
[00:46:27.120 --> 00:46:27.760] I know.
[00:46:28.000 --> 00:46:34.800] You know that guy that, if you like Netflix, that Brian Johnson, the tech billionaire, is trying to live, well, forever.
[00:46:35.040 --> 00:46:42.800] And he's got all the money and resources and help and technology to test his blood every day.
[00:46:42.800 --> 00:46:45.040] And he takes all these supplements.
[00:46:45.040 --> 00:46:48.480] He's even more fanatic than Ray Kurzweil about this.
[00:46:48.480 --> 00:46:54.320] But if you watch the Netflix, I don't know if it's Netflix, but anyway, there's a documentary about him where they follow him around.
[00:46:54.320 --> 00:46:56.240] To me, he doesn't look healthy.
[00:46:56.480 --> 00:47:01.600] I mean, he does all this stuff that's supposedly healthy, but he never goes outside because he doesn't want to get any sun.
[00:47:01.600 --> 00:47:02.240] Okay.
[00:47:02.560 --> 00:47:04.000] So he looks kind of pale.
[00:47:04.000 --> 00:47:10.320] And then he has dinner at 11:30 in the morning, and he goes to bed by 8:15 at the latest.
[00:47:10.320 --> 00:47:12.640] And the filmmaker is like, How's your life?
[00:47:12.640 --> 00:47:13.440] He's a single guy.
[00:47:13.440 --> 00:47:14.240] How's your love life?
[00:47:14.240 --> 00:47:16.080] He goes, Not very good.
[00:47:17.360 --> 00:47:17.840] I know.
[00:47:18.000 --> 00:47:25.440] I heard a podcast with him, and I thought, well, if that's, if you have to live a long time like that, then I prefer to die younger.
[00:47:25.680 --> 00:47:26.640] Doesn't sound like living.
[00:47:26.720 --> 00:47:27.760] I'd have some fun.
[00:47:27.760 --> 00:47:28.240] Yeah, yeah.
[00:47:28.960 --> 00:47:29.280] All right.
[00:47:29.440 --> 00:47:30.440] Yeah, the things people do.
[00:47:30.680 --> 00:47:44.440] Some of the other things that you talk about in the book that I enjoyed, Paul Ekman's research on reading emotions in people's faces and body language, and then Trivor's deceit and self-deception.
[00:47:44.440 --> 00:47:52.840] You know, where we evolved a reasonable capacity to detect when somebody else is trying to deceive us, but it's not perfect.
[00:47:52.840 --> 00:48:04.680] And if the person believes the lie, you know, believes what they're saying, then their body won't give off the cues that they're lying, which, you know, Trivor's argues is a heavier cognitive load.
[00:48:04.680 --> 00:48:11.960] So you have to keep track of all the stuff, and that gets reflected in your micro-expressions in your face, I guess, and whatever.
[00:48:12.280 --> 00:48:14.440] So I thought that was pretty interesting.
[00:48:15.400 --> 00:48:23.720] Well, so I don't know if you remember when you came to Australia, but we have very strict rules in what you are allowed to bring in.
[00:48:23.720 --> 00:48:29.000] So if you have anything made of timber, you're not allowed to declare it.
[00:48:29.000 --> 00:48:32.520] You're not allowed to bring in fruit, vegetables, etc., etc.
[00:48:33.160 --> 00:48:38.680] And at the time, I was still sort of moving from the Netherlands to Australia.
[00:48:38.680 --> 00:48:45.080] So I had packed some wooden things in the Netherlands, put them in my bag, completely forgot about them.
[00:48:45.080 --> 00:48:48.840] So on the plane, when you're about to land in Australia, you have to tick all the books.
[00:48:48.840 --> 00:48:51.320] No, I don't have any wooden items, etc.
[00:48:52.200 --> 00:48:56.440] I landed in Sydney and they had they x-rayed my bag.
[00:48:56.440 --> 00:48:58.920] Oh, no, there's always ask you questions.
[00:48:58.920 --> 00:49:01.000] You know, do you have anything that you need to declare?
[00:49:01.000 --> 00:49:07.400] And I said, no, I don't have anything to declare because I honestly totally forgot about these wooden things that I put in my bag.
[00:49:07.720 --> 00:49:09.640] And so I said, no, I've got nothing.
[00:49:09.640 --> 00:49:12.520] And then they said, okay, you can just go through.
[00:49:12.520 --> 00:49:16.800] At home, I opened my bag, and the first thing I saw were those wooden items.
[00:49:16.800 --> 00:49:28.080] And I thought, Oh, dear, if they would have x-rayed my bag and I would have and I had said, No, I don't have anything made of wood, I would have been in serious trouble because I would have thought that I lied.
[00:49:28.080 --> 00:49:31.040] But I wasn't lying because I completely forgotten.
[00:49:31.360 --> 00:49:34.000] So, clearly, my body language is very clear.
[00:49:34.160 --> 00:49:36.800] No, she doesn't have anything that she needs to declare.
[00:49:36.800 --> 00:49:41.440] So, I totally, I totally believe this self-deception.
[00:49:41.760 --> 00:49:49.600] And I'm sure you've had arguments with people who claim that they were certain that they were right about something, even though you know that they were wrong.
[00:49:49.600 --> 00:49:53.680] But they're just so convinced that there's no way you can convince them that they're wrong.
[00:49:53.680 --> 00:50:20.800] So, I think self-deception makes complete sense, but at the same time, it's also very scary because I seem to remember that he used examples of airline pilots who made terrible mistakes but were certain that they were right, really believed that they could do something or they basically convinced themselves that they were much better, which I think is often something you find in young men too.
[00:50:20.800 --> 00:50:29.120] One of the reasons why young men have a higher mortality than young females that believe too much in themselves.
[00:50:29.120 --> 00:50:35.600] Competing for the Darwin Award picking themselves out of the gene pool early before returning to them.
[00:50:35.680 --> 00:50:37.280] Is that still being handed out, by the way?
[00:50:37.520 --> 00:50:38.880] I haven't seen it in a few years.
[00:50:38.880 --> 00:50:40.880] I'm not sure that they do that anymore.
[00:50:41.040 --> 00:50:42.960] Because I'm sure there are many contestants.
[00:50:43.360 --> 00:50:54.000] Oh my god, well, you know, Twitter has endless feeds of videos of guys like, hold my beer while I jump off the roof and see if I can make it to the pool.
[00:50:54.320 --> 00:50:59.440] And there's just hundreds of them, just crazy stuff that people, mostly guys, right?
[00:50:59.440 --> 00:51:02.600] Probably, and this, you know, Jeffrey Miller would probably have something to say about this, right?
[00:51:02.840 --> 00:51:08.440] Status seeking amongst other guys, trying to impress women, you know, that sort of thing.
[00:51:08.440 --> 00:51:15.720] And the idea being, my genes are so good, I can do this crazy-ass, completely high-risk thing and still survive.
[00:51:15.720 --> 00:51:18.200] So that's how good I am.
[00:51:18.200 --> 00:51:19.080] Until they're not.
[00:51:19.240 --> 00:51:20.120] Until they're not, right.
[00:51:20.120 --> 00:51:21.800] Until they get taken out of the gene pool earlier.
[00:51:21.880 --> 00:51:22.680] But not all of them.
[00:51:22.680 --> 00:51:25.240] So some of that still exists in our gene pool.
[00:51:25.560 --> 00:51:25.880] I know.
[00:51:25.880 --> 00:51:34.840] I was going to ask you: how old would we be if your half of the species had hips big enough and a pelvic opening big enough to have a big brain baby at?
[00:51:34.840 --> 00:51:36.840] I don't know, what, 18 months or 20 months?
[00:51:37.080 --> 00:51:41.240] What would our natural birth gestation period be?
[00:51:41.400 --> 00:51:43.080] I think about twice as long.
[00:51:43.080 --> 00:51:45.320] Yeah, so 18 months.
[00:51:45.320 --> 00:51:45.720] Yeah.
[00:51:45.720 --> 00:51:53.640] Yeah, but it just doesn't work because the mother, the pregnant woman, can't, she just can't produce the energy.
[00:51:53.640 --> 00:51:54.680] Right, right.
[00:51:55.000 --> 00:51:58.600] Oh, so it's not just the pelvic opening, it's everything.
[00:51:58.600 --> 00:51:59.480] It's both.
[00:51:59.480 --> 00:52:00.200] Okay, all right.
[00:52:00.520 --> 00:52:08.200] It's just because the brain is so expensive, she will have to produce so much more energy to maintain herself and to grow this even bigger brain.
[00:52:08.200 --> 00:52:09.000] Yeah, yeah.
[00:52:09.320 --> 00:52:14.520] I wanted to read a little section from your book because I thought this was so interesting on how scientists work here.
[00:52:14.520 --> 00:52:22.360] You're writing about this ossuary collection of bones in Hallstatt, Austria, called the Hallstatt-Beinhaas.
[00:52:22.360 --> 00:52:24.840] Beinhaas is German for house of bones.
[00:52:24.840 --> 00:52:31.560] It's an idyllic village positioned on the shore of Lake Hallstadt, surrounded by the mountains.
[00:52:32.120 --> 00:52:36.280] In the 1700s, the local church started a less than idyllic tradition.
[00:52:36.280 --> 00:52:40.760] They started to dig up corpses of people who died in the preceding 10 to 15 years.
[00:52:40.760 --> 00:52:42.600] They were running out of space.
[00:52:42.600 --> 00:52:50.400] And so family members of the deceased were asked to stack up their loved ones' bones inside the house, now known as the Hallstatt Beinhaus.
[00:52:50.560 --> 00:52:56.400] The bones of relatives would be placed next to kin, thereby starting a genealogical record.
[00:52:56.400 --> 00:53:03.520] From 1720 onward, the skulls were adorned with symbolic decorations as well as the dates of birth and death.
[00:53:03.520 --> 00:53:13.360] The practice of digging up the remains and keeping the bones in the Binehaus continued until the 1960s, providing interested scientists with a wealth of data.
[00:53:13.360 --> 00:53:14.400] I mean, that's astonishing.
[00:53:14.400 --> 00:53:15.680] Somebody would even find that.
[00:53:15.680 --> 00:53:17.040] Oh, here's a good data set.
[00:53:17.040 --> 00:53:19.120] All right, what did they do with it from there?
[00:53:20.080 --> 00:53:26.240] Well, first off, I read somewhere that it's also a tourist mecca.
[00:53:26.240 --> 00:53:31.200] So that's probably why the scientists found it in the first place.
[00:53:31.520 --> 00:53:38.000] So what they did is I wanted to see what the heritability is of components of the skull.
[00:53:38.000 --> 00:53:54.000] So as I mentioned earlier, when we had this weird genetic fluke that prevented, or not prevented, that gave us more neurons because the neuronal stem cell started to divide more and more and more, producing more neurons.
[00:53:54.000 --> 00:53:55.440] The skull had to adapt.
[00:53:55.440 --> 00:54:07.360] So what these people wanted to find out is if you change one, if there's selection on one component of the skull, does that translate to the whole skull changing?
[00:54:07.360 --> 00:54:20.640] So basically, is the skull a collection of independent parts on which selection acts independently, or is it basically one structure?
[00:54:20.640 --> 00:54:25.200] So, selection acts on the same, on the whole structure.
[00:54:25.200 --> 00:54:27.920] And so, they looked at different components.
[00:54:28.160 --> 00:54:42.440] So, this base of the neck, basically, and all the different components of the skull, all the little bones that make the skull.
[00:54:43.080 --> 00:54:54.600] And they basically looked at individuals that were related, compared those different components, and looked at the heritability.
[00:54:54.600 --> 00:55:01.480] And what they found was that the skull basically changes as one.
[00:55:01.480 --> 00:55:08.920] So, if selection acts on one particular case, one particular part of the skull, the whole skull changes.
[00:55:08.920 --> 00:55:19.080] And that fitted my story because it means that when you have this expanding brain, that puts pressure on the base of the skull.
[00:55:19.080 --> 00:55:31.960] And that then, if you put selection on one part of the skull, the whole skull is able to change to compensate or to allow that large brain to grow.
[00:55:31.960 --> 00:55:37.240] And that's why we ended up with this very bulbous head, which you don't see when you have hair.
[00:55:37.240 --> 00:55:41.000] But my grandchild, of course, she doesn't have any hair.
[00:55:41.480 --> 00:55:44.760] She really has a very strange, large head.
[00:55:44.760 --> 00:55:51.880] It's funny that, you know, once you start reading and writing about these things, you notice it in little babies.
[00:55:51.880 --> 00:55:59.080] Babies have strange heads, and that head is all to do with having to fit that large brain in.
[00:55:59.080 --> 00:56:09.480] So they found good evidence that there's a genetic, that the it's heritable, which means that selection can act on the skull.
[00:56:09.480 --> 00:56:15.120] And then through natural selection, the skull can basically change, adapt.
[00:56:15.120 --> 00:56:15.600] Right.
[00:56:14.680 --> 00:56:16.240] So interesting.
[00:56:16.400 --> 00:56:18.720] Is that pleiotropy, I think?
[00:56:18.720 --> 00:56:25.760] Where you select for one particular characteristic and a bunch of others change with it that you don't even know or anticipate?
[00:56:26.080 --> 00:56:26.880] Exactly.
[00:56:26.880 --> 00:56:41.440] Like the silver foxes in Russia, that famous experiment, where they selected for docility, defined by, you know, how close can I get to the little silver fox before it nips at me, all the way up to sitting in my lap and I'm petting it.
[00:56:42.080 --> 00:56:54.960] But they also developed floppy ears and the little spot on the forehead and a curly tail and, you know, a bunch of other things that came along with the selection for just that one thing, docility.
[00:56:55.920 --> 00:57:02.400] Well, that's self-domestication, what Wrangham uses in his book, The Goodness Paradox.
[00:57:02.640 --> 00:57:03.280] Right, right.
[00:57:03.280 --> 00:57:20.480] So if you, so he claims that humans self-domesticate it as well, because, well, so the paradox, which I actually think, I think the idea is really good, interesting, although I don't think that that was the driving force behind our sociality.
[00:57:20.480 --> 00:57:36.080] So he claims that we had to, in order to have stable social groups, we had to get rid of people who don't adhere to the rules, say bullies or whatever.
[00:57:37.040 --> 00:57:43.120] So there was selection against highly aggressive individuals.
[00:57:44.720 --> 00:57:47.680] So that's basically the idea behind self-domestication.
[00:57:47.920 --> 00:57:51.680] You become nicer and kinder to other individuals.
[00:57:51.680 --> 00:57:56.880] But in order to get rid of those nasty pieces of work, they had to be killed.
[00:57:57.040 --> 00:58:09.000] So you had to also at the same time have to have the ability to extreme violence because you need executioners or people who kill other people in order for them not to kill other people, if you see what I mean.
[00:58:09.000 --> 00:58:10.520] So that's his paradox.
[00:58:10.520 --> 00:58:12.280] So the goodness paradox.
[00:58:12.280 --> 00:58:26.120] So he says, he claims that we self-domesticate it so that we are nice to individuals, even individuals to whom we are not related, which of course is the big problem, if you wish, in evolutionary biology.
[00:58:26.120 --> 00:58:32.040] Why are individuals of any species nice to someone with whom they don't share any genes?
[00:58:32.040 --> 00:58:36.360] Not so much nice, but why do they behave in an altruistic sense?
[00:58:36.360 --> 00:58:42.280] So have a behavior that's costly to themselves, but benefits someone else.
[00:58:42.280 --> 00:58:54.680] Darwin explained that by interacting with family, because if you do something to a family member at the expense of yourself, you're still helping your genes to be transmitted to the next generation.
[00:58:54.680 --> 00:59:00.920] But if the receiver or the recipient of your help is not related, then that's a problem.
[00:59:00.920 --> 00:59:07.480] And that's a problem that Rangham explains by self-domestications.
[00:59:07.480 --> 00:59:23.320] What I do find interesting is in all instances of where you find self-domestication across the animal kingdom, for example, you mentioned the silphal foxes, but dogs, etc., they all revert to a particular phenotype.
[00:59:23.960 --> 00:59:32.920] So that is, yes, that is pleiotropy that you select for one thing, and then you get this whole string of other characteristics.
[00:59:32.920 --> 00:59:33.320] Yeah.
[00:59:33.640 --> 00:59:43.560] This was always my criticism of the Nobel Prize sperm bank that that guy Graham created in San Diego.
[00:59:43.520 --> 00:59:47.440] And he's just selecting for one thing, you know, basically your score on an IQ test.
[00:59:47.440 --> 00:59:48.000] But what else?
[00:59:48.240 --> 00:59:50.240] But he was the only one that had sperm in it, wasn't he?
[00:59:44.920 --> 00:59:51.440] I don't think there was anyone.
[00:59:51.760 --> 00:59:53.280] There were two or three others.
[00:59:54.400 --> 00:59:57.600] Almost everybody he wrote, all Nobel Prize winners turned him down.
[00:59:57.600 --> 00:59:58.880] But yeah, there were a couple.
[00:59:58.880 --> 01:00:00.560] But it was mainly his, yeah.
[01:00:01.200 --> 01:00:06.560] Well, but the argument against it in principle is that you need genetic diversity.
[01:00:06.560 --> 01:00:20.320] And by being so careful in your eugenical selection, you're going to either get a bunch of weird stuff you're not looking for, or you'll miss out on some diversity that's actually good for the species or good for your nation or society.
[01:00:21.280 --> 01:00:36.400] Yeah, it's even more scary though, because if you believe Jeff Miller's the mating mind, so in it, so he shows this nice bell curve with intelligence for women and for men.
[01:00:36.400 --> 01:00:44.800] And the bell curves, the average totally overlaps, or most of the bell curve actually overlaps, but the tails are different.
[01:00:45.120 --> 01:00:49.680] So the tails for men are longer on both sides.
[01:00:49.680 --> 01:00:55.360] So you have more stupid, extremely stupid.
[01:00:55.680 --> 01:01:05.040] And his argument comes back from selection on so the male brain is, or the brain is successful.
[01:01:05.040 --> 01:01:06.880] Every business has an ambition.
[01:01:06.880 --> 01:01:16.320] PayPal Open is the platform designed to help you grow into yours with business loans so you can expand and access to hundreds of millions of PayPal customers worldwide.
[01:01:16.320 --> 01:01:22.640] And your customers can pay all the ways they want with PayPal, Venmo, Paylater, and all major cards.
[01:01:22.640 --> 01:01:24.720] So you can focus on scaling up.
[01:01:24.720 --> 01:01:29.520] When it's time to get growing, there's one platform for all business: PayPal Open.
[01:01:29.520 --> 01:01:32.200] Grow today at PayPalopen.com.
[01:01:29.760 --> 01:01:34.440] Loan subject to approval in available locations.
[01:01:34.760 --> 01:01:51.080] So you select a trait, which means that you, at the same time, you select for high intelligence, but the cost is that at the same time, I guess also through pleiotropy or whatever, you select for weird characteristics.
[01:01:51.080 --> 01:01:56.120] And he explains the higher prevalence of things like autism, etc.
[01:01:57.000 --> 01:01:59.720] is often linked to high intelligence.
[01:01:59.720 --> 01:02:05.160] So you may then have sperm from a Nobel Prize winner, but you don't know anything about this individual.
[01:02:05.160 --> 01:02:11.480] So you may end up with the psychopath as a child or someone with autism.
[01:02:12.120 --> 01:02:13.960] Anyway, we're kind of off topic there.
[01:02:13.960 --> 01:02:14.440] I wanted to.
[01:02:14.680 --> 01:02:22.040] Well, oh, so I was going to mention Trevor's theory of reciprocal altruism as why you would be solving the problem of altruism.
[01:02:22.040 --> 01:02:31.960] You know, I'm being nice to a stranger because, well, stranger, somebody in the other village, so that they'll be nice to me when times are hard for me or they're less likely to attack me if I was nice to them.
[01:02:31.960 --> 01:02:33.480] Something like that.
[01:02:34.440 --> 01:02:36.760] Yeah, no, that's a good example.
[01:02:37.400 --> 01:02:46.840] But then, of course, you do need to live in stable groups and you have to be able to recognize the other individual and remember that that person owes you something.
[01:02:47.240 --> 01:02:48.360] So I give an example.
[01:02:48.360 --> 01:02:55.640] It's an example I used to give in an animal behavior lecture of two young chimpanzee males.
[01:02:55.640 --> 01:02:58.760] So chimpanzees live in heroin group.
[01:02:58.760 --> 01:03:02.200] So you have the dominant male and mates with all the females.
[01:03:02.440 --> 01:03:05.000] But of course, there's the equal sex ratio.
[01:03:05.000 --> 01:03:18.000] So you always have young males, which at some stage will be either they replace the alpha male or they leave the group and try and establish a group themselves.
[01:03:18.320 --> 01:03:25.360] So this is an example that I found somewhere where there's two young males who want to mate with another female.
[01:03:25.360 --> 01:03:37.200] The female wants to mate with them, but that's tricky because if the dominant female finds out that his underlings are mating with one of his females, he won't be happy.
[01:03:37.200 --> 01:03:39.200] So they strike this deal.
[01:03:39.200 --> 01:03:44.800] So one of them distracts the male by, I don't know, throwing sticks at him or something.
[01:03:44.800 --> 01:03:52.480] The dominant male becomes completely enraged and races after the youngster that's teasing him.
[01:03:52.480 --> 01:03:57.840] The youngster runs in the opposite direction, so the other young male can mate with the female.
[01:03:57.840 --> 01:03:59.680] Next time, they reverse.
[01:03:59.680 --> 01:04:05.120] So now it's the one that mated the first time is now teasing the dominant male.
[01:04:05.120 --> 01:04:06.320] So they reciprocate.
[01:04:06.320 --> 01:04:11.200] And I think that's a great example of reciprocation.
[01:04:11.200 --> 01:04:20.160] But of course, it only works, as I said earlier, if you remember the other individual and if you can come up with a scheme in the first place.
[01:04:20.160 --> 01:04:20.720] Right.
[01:04:21.280 --> 01:04:23.760] So you have to have high social skills in order to do that.
[01:04:23.920 --> 01:04:27.680] Well, there's those vampire bats that share blood, right?
[01:04:28.000 --> 01:04:32.480] So they're keeping track somehow of who was a good cooperator and who wasn't.
[01:04:33.120 --> 01:04:33.680] Yes.
[01:04:33.680 --> 01:04:36.800] And I think, don't they always roost in the same place?
[01:04:36.800 --> 01:04:38.320] They have a communal roost at night.
[01:04:38.720 --> 01:04:39.040] Right.
[01:04:39.040 --> 01:04:39.680] Yeah.
[01:04:39.680 --> 01:04:40.080] Yeah.
[01:04:40.800 --> 01:04:43.360] And they must be related.
[01:04:43.680 --> 01:04:52.240] Well, and again, I mean, so our sense of morality just builds on something that came from our ancestors that you can still find in chimpanzees and other social animals.
[01:04:52.240 --> 01:04:53.120] Yeah, yeah, yeah.
[01:04:53.440 --> 01:05:00.680] So, it's not something that's uniquely human again and that's God-given or whatever, since I'm also an atheist.
[01:05:00.680 --> 01:05:01.160] Yeah.
[01:04:59.840 --> 01:05:05.880] A couple of the things you write about here that made me think of some interesting topics.
[01:05:06.200 --> 01:05:08.680] Mind reading, theory of mind.
[01:05:08.840 --> 01:05:15.720] You mentioned Alison Gopnik has that famous experiment from the 80s with the little box of candles or crayons.
[01:05:15.720 --> 01:05:20.200] The box of crayons, you show a three-year-old a box of crayons, you know, what's in there, crayons.
[01:05:20.200 --> 01:05:23.480] So she opens it up, and there's candles in the box of crayons.
[01:05:23.560 --> 01:05:25.160] Like, oh, surprise.
[01:05:25.160 --> 01:05:32.840] Now you put it away, and then you bring in another child, and you ask the one that knows what's in there, you know, what's he going to think is in there?
[01:05:32.840 --> 01:05:33.800] Oh, candles.
[01:05:33.800 --> 01:05:35.000] Like, no.
[01:05:35.640 --> 01:05:42.760] You know, and then by age four or five, I guess they develop a theory of mind and they're able to pass the candle, the theory of mind test, something like that.
[01:05:42.760 --> 01:05:46.920] So that's an interesting twist on this whole origins of language.
[01:05:46.920 --> 01:05:53.960] At some point, maybe age four, I think Allison said, is when mind reading comes on online.
[01:05:55.480 --> 01:05:57.880] It's not so much mind reading, though, is it?
[01:05:58.200 --> 01:06:03.800] It's understanding what the other person is thinking, which I don't think is quite the same as mind reading.
[01:06:03.800 --> 01:06:09.160] Because if you look at some of the work of Tomasello's work on little toddlers.
[01:06:09.320 --> 01:06:10.280] Yeah, you wrote about that.
[01:06:10.280 --> 01:06:11.960] Yeah, that was super interesting.
[01:06:12.360 --> 01:06:15.720] These videos are so lovely.
[01:06:17.880 --> 01:06:23.640] So you have the caretaker sits in a corner, just quietly doing her thing.
[01:06:23.960 --> 01:06:29.080] The little toddler is exploring the area.
[01:06:29.080 --> 01:06:30.360] And then a man comes in.
[01:06:30.360 --> 01:06:34.280] A stranger comes in carrying a pile of books and papers.
[01:06:34.280 --> 01:06:36.440] And there's a cabinet with a closed door.
[01:06:36.440 --> 01:06:44.040] So the stranger walks towards the cabinet and pushes the pile of books and papers against the closed door.
[01:06:44.040 --> 01:06:47.840] And this toddler is thinking, What's this man doing?
[01:06:48.160 --> 01:06:53.360] Because doesn't he know you have to open the door before you can put the b
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Prompt 5: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 2 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
x of crayons.
[01:05:23.560 --> 01:05:25.160] Like, oh, surprise.
[01:05:25.160 --> 01:05:32.840] Now you put it away, and then you bring in another child, and you ask the one that knows what's in there, you know, what's he going to think is in there?
[01:05:32.840 --> 01:05:33.800] Oh, candles.
[01:05:33.800 --> 01:05:35.000] Like, no.
[01:05:35.640 --> 01:05:42.760] You know, and then by age four or five, I guess they develop a theory of mind and they're able to pass the candle, the theory of mind test, something like that.
[01:05:42.760 --> 01:05:46.920] So that's an interesting twist on this whole origins of language.
[01:05:46.920 --> 01:05:53.960] At some point, maybe age four, I think Allison said, is when mind reading comes on online.
[01:05:55.480 --> 01:05:57.880] It's not so much mind reading, though, is it?
[01:05:58.200 --> 01:06:03.800] It's understanding what the other person is thinking, which I don't think is quite the same as mind reading.
[01:06:03.800 --> 01:06:09.160] Because if you look at some of the work of Tomasello's work on little toddlers.
[01:06:09.320 --> 01:06:10.280] Yeah, you wrote about that.
[01:06:10.280 --> 01:06:11.960] Yeah, that was super interesting.
[01:06:12.360 --> 01:06:15.720] These videos are so lovely.
[01:06:17.880 --> 01:06:23.640] So you have the caretaker sits in a corner, just quietly doing her thing.
[01:06:23.960 --> 01:06:29.080] The little toddler is exploring the area.
[01:06:29.080 --> 01:06:30.360] And then a man comes in.
[01:06:30.360 --> 01:06:34.280] A stranger comes in carrying a pile of books and papers.
[01:06:34.280 --> 01:06:36.440] And there's a cabinet with a closed door.
[01:06:36.440 --> 01:06:44.040] So the stranger walks towards the cabinet and pushes the pile of books and papers against the closed door.
[01:06:44.040 --> 01:06:47.840] And this toddler is thinking, What's this man doing?
[01:06:48.160 --> 01:06:53.360] Because doesn't he know you have to open the door before you can put the books in the shelf?
[01:06:53.360 --> 01:06:55.680] So I think it can't even walk yet.
[01:06:55.680 --> 01:07:04.560] So it crawls over to the cabinet, stands up, opens the door, and then really looks at this stranger.
[01:07:04.560 --> 01:07:06.080] Is this what you want?
[01:07:07.120 --> 01:07:08.880] So that's mind reading.
[01:07:08.880 --> 01:07:09.280] Yeah.
[01:07:09.600 --> 01:07:11.040] It's just a different level.
[01:07:11.440 --> 01:07:14.480] So I think it's well, yeah.
[01:07:14.480 --> 01:07:19.600] So mind reading starts from the beginning, I think, almost when they're born, because they have to.
[01:07:20.080 --> 01:07:28.000] Because they have to know, know, if I do this, then I get what I need.
[01:07:28.960 --> 01:07:33.360] And what they need, of course, is really important because they're totally dependent.
[01:07:34.000 --> 01:07:42.160] So I can see that there's a jump, what you just explained, the jump in comprehension.
[01:07:42.160 --> 01:07:48.640] They can't know what's in the books because they haven't looked in the books, and the book says it's crayons, but I know there's candles in it.
[01:07:48.640 --> 01:07:50.720] So that's a jump in development.
[01:07:50.720 --> 01:07:53.520] But the mind reading starts much earlier.
[01:07:53.600 --> 01:07:55.040] Yeah, that's a good point.
[01:07:55.040 --> 01:07:55.760] Right.
[01:07:55.760 --> 01:07:57.280] Yeah, you've probably seen that research.
[01:07:57.280 --> 01:08:02.960] I think it's Greg Burns on dogs, mind reading versus chimps.
[01:08:02.960 --> 01:08:08.480] So, like, dogs evolved with humans in social groups, so they learn to read the cues from humans.
[01:08:08.480 --> 01:08:13.040] So the experiment is they have like an opaque little box with food in it or not.
[01:08:13.040 --> 01:08:16.400] And, you know, the experimenter points or just looks.
[01:08:16.400 --> 01:08:25.040] And the dogs are better at reading the pointing or the eye gaze than the chimps are of the humans because they learned, they evolved to be able to do that.
[01:08:25.040 --> 01:08:25.760] Yep.
[01:08:25.760 --> 01:08:26.800] That's a kind of mind reading.
[01:08:27.120 --> 01:08:28.640] Dogs are pretty good at mind reading.
[01:08:29.200 --> 01:08:31.080] Oh, my dog is totally into it.
[01:08:31.080 --> 01:08:32.760] Yeah, he knows what I'm going to do.
[01:08:32.760 --> 01:08:38.200] If I get my bike out, if I get my bike shoes out, he knows, oh, we're not going for a walk.
[01:08:38.360 --> 01:08:39.640] He crawls back to his little base.
[01:08:40.280 --> 01:08:43.560] But if I got my tennis shoes, he's like, okay, here we go.
[01:08:44.520 --> 01:08:50.440] No, but again, because the dog can't open the can of dog food, so they need the human.
[01:08:50.440 --> 01:08:53.480] So they're well attuned to understanding what the humans want.
[01:08:53.480 --> 01:08:54.120] Right.
[01:08:54.120 --> 01:08:54.440] All right.
[01:08:54.440 --> 01:08:55.800] You mentioned the film Arrival.
[01:08:55.800 --> 01:09:03.240] I love that film because finally it was a social scientist that solved the problem rather than a physicist of communicating with the aliens.
[01:09:05.080 --> 01:09:20.120] So, but that brings to mind, you know, sometimes presumably if we discovered ETIs, like SETI was successful, they would probably have some form of communication because you have to have a social species to be able to master space travel and so on.
[01:09:20.120 --> 01:09:22.920] So, therefore, you need some kind of communication.
[01:09:23.560 --> 01:09:31.320] I would say it has to be very sophisticated communication because otherwise they couldn't organize themselves or build spacecraft, etc.
[01:09:31.560 --> 01:09:32.200] Right.
[01:09:32.200 --> 01:09:32.680] Yeah.
[01:09:33.320 --> 01:09:35.000] But how would you know how to communicate?
[01:09:35.000 --> 01:09:36.200] That's always an interesting problem.
[01:09:36.200 --> 01:09:42.440] Remember the group that Sagan was a part of this group, the group, the dolphin, what was it?
[01:09:43.800 --> 01:09:45.880] There was a group of dolphin studiers.
[01:09:45.880 --> 01:09:46.440] Oh, John C.
[01:09:46.520 --> 01:09:50.120] Lilly's group, in which they were trying to communicate with dolphins.
[01:09:50.120 --> 01:09:56.040] This is when Sagan was, you know, first took up the idea of looking for aliens and discovered we don't even know what the dolphins are saying.
[01:09:56.040 --> 01:09:58.200] How are we going to know what the aliens are saying?
[01:09:59.160 --> 01:10:03.840] It's part of the experiment where this woman lifts with a dolphin in a half watched house.
[01:10:04.360 --> 01:10:04.760] Yes.
[01:10:05.160 --> 01:10:05.800] Yeah.
[01:10:05.800 --> 01:10:14.280] Yeah, I think the dolphin got kind of fond of her, sort of like Nimchimsky getting fond of the female trainer and jealous of the males around her and so on.
[01:10:14.280 --> 01:10:18.080] Yes, I remember a podcast again, yes, about that.
[01:10:18.080 --> 01:10:18.560] Oh, you do?
[01:10:18.560 --> 01:10:19.440] Okay, yeah, interesting.
[01:10:19.440 --> 01:10:20.320] You listen to podcasts.
[01:10:20.400 --> 01:10:21.680] Yeah, I think it was a radio lab.
[01:10:14.840 --> 01:10:22.240] It's been a while.
[01:10:22.480 --> 01:10:23.200] Oh, all right.
[01:10:23.200 --> 01:10:23.840] Right.
[01:10:24.160 --> 01:10:29.120] But then Lily gave dolphins LSD, and that kind of messed up the experiment, I guess.
[01:10:29.120 --> 01:10:30.240] I wonder why.
[01:10:32.080 --> 01:10:33.120] All right, finally, what?
[01:10:33.120 --> 01:10:34.400] Oh, no, two last things.
[01:10:34.400 --> 01:10:39.520] One on you, you mentioned AI and large language models and so on.
[01:10:40.240 --> 01:10:44.160] Is AI developing a new form of language and communication?
[01:10:45.120 --> 01:10:51.920] I can't see how because it's basically plagiarizing what we've done or what we're doing.
[01:10:52.560 --> 01:10:56.240] I actually read, she's an author.
[01:10:56.240 --> 01:10:59.120] I can't remember who it is.
[01:10:59.120 --> 01:11:00.160] Anna Funder.
[01:11:00.160 --> 01:11:03.120] Yes, I read an essay by her the other day.
[01:11:03.120 --> 01:11:13.440] And well, first off, she's very cross that all her work is basically being used to train AI without authors having given permission.
[01:11:13.440 --> 01:11:17.040] And I think that's the case for all authors because no one's given permission.
[01:11:17.040 --> 01:11:24.880] But if your work is available online, then AI can use your work to train.
[01:11:24.880 --> 01:11:29.280] So she makes that point that that is actually rather unethical.
[01:11:29.600 --> 01:11:43.520] And then the other point that she makes is: when I write, I'm not speaking as an afunder, when I write, I always try and come up with words that you don't normally see in this particular order.
[01:11:43.520 --> 01:11:58.720] Because what AI does, it will always use the words in a particular order, the words that are most frequently used in that particular order, because that's what it's how it's trained, because it looks at all the commonalities.
[01:11:59.040 --> 01:12:11.400] So she makes the really good argument that AI will never be able to come up with some creative way of using language because that's just not how the algorithm works.
[01:12:11.400 --> 01:12:17.000] For that to work, you have to then assume that it's going to become conscious, whatever that means.
[01:12:17.000 --> 01:12:26.120] I have a real difficulty with the whole word of consciousness and become something akin to something alive.
[01:12:26.120 --> 01:12:27.640] And I don't see that happen.
[01:12:27.640 --> 01:12:28.280] Yeah.
[01:12:28.520 --> 01:12:33.240] Okay, does it take a village that is raising children has always required a community?
[01:12:33.240 --> 01:12:40.200] The nuclear family, fueled by cultural change and capitalism, isolates parents, removing the ancestral support system.
[01:12:40.200 --> 01:12:45.560] Without grandparents or extended family nearby, we must find new ways to support parents and children.
[01:12:45.560 --> 01:12:46.040] Okay.
[01:12:46.040 --> 01:12:57.320] So just first, is there something biological or evolutionary about the nuclear family or a male and a female and their children or something like that?
[01:12:57.320 --> 01:13:01.560] Or is that more of a cultural invention in the early modern period?
[01:13:01.560 --> 01:13:03.720] Or how do you think about all that?
[01:13:04.680 --> 01:13:12.760] Well, of course, the family is a unit of selection, if you wish, because they're all related, so they all share genes.
[01:13:12.760 --> 01:13:15.320] So in that respect, it's a biological entity.
[01:13:15.320 --> 01:13:22.760] But I think what I'm more referring to is the idea that mothers in particular are the ones that have to raise their children.
[01:13:22.760 --> 01:13:29.880] And that comes from the Industrial Revolution, where dad goes out to work and the mum stays at home.
[01:13:30.200 --> 01:13:35.240] But also if you and we still s don't seem to be able to quite get over that.
[01:13:35.240 --> 01:13:45.000] So if you look at new developments, so here in Australia, because of huge immigration rates, they're constantly building new houses.
[01:13:45.120 --> 01:13:54.640] It's also because Australians want to have their detached home on a, I don't know, two and a half acre block or whatever, they say.
[01:13:54.640 --> 01:13:59.600] So we have this sprawling suburbs which are not connected to anything.
[01:13:59.600 --> 01:14:03.840] There's no schools, there's no shops, there's no place to work.
[01:14:03.840 --> 01:14:05.520] So what happens?
[01:14:05.520 --> 01:14:20.480] Mainly the men, mostly the men, they go off to work and the woman is alone with the child or the children, totally isolated from anyone else because, well, I guess she's got other mothers and children in the same neighbourhood.
[01:14:20.480 --> 01:14:21.840] That is what I refer.
[01:14:21.920 --> 01:14:25.360] That's what I think is totally unnatural.
[01:14:26.160 --> 01:14:31.520] Because, you know, the kids have to be able to run out and bump into other kids.
[01:14:31.520 --> 01:14:35.360] There are other individuals that they interact with.
[01:14:35.680 --> 01:14:45.840] Of course, need a village to raise a child is a point that Sarah Hurdie, Blaffer Hurdy, made in her books.
[01:14:46.000 --> 01:14:46.560] Mother and Mario.
[01:14:46.720 --> 01:14:49.440] But I think originally Mother Nature.
[01:14:49.760 --> 01:14:54.320] But I think it mainly Mothers and Others is another book that I really enjoyed.
[01:14:55.200 --> 01:15:03.200] But I think it comes from an older saying that she co-opted Inuits, maybe?
[01:15:03.200 --> 01:15:04.320] I can't quite remember.
[01:15:04.320 --> 01:15:09.440] I used to have it on the slide in one of my lectures, but I can't remember.
[01:15:12.400 --> 01:15:18.160] It's yeah, so the natural family, if you wish, I think are kin groups.
[01:15:18.160 --> 01:15:20.560] So, extended families.
[01:15:21.200 --> 01:15:23.120] So, that's how Lucy, etc.
[01:15:23.440 --> 01:15:24.720] Probably started.
[01:15:24.720 --> 01:15:25.120] Yeah.
[01:15:25.440 --> 01:15:26.000] Yeah.
[01:15:26.040 --> 01:15:27.280] Well, aunties.
[01:15:27.520 --> 01:15:30.040] Right, that gets back to the thesis of your book.
[01:15:30.040 --> 01:15:32.040] That's what you need language for, right?
[01:15:32.360 --> 01:15:33.160] All right, Madeline.
[01:15:33.160 --> 01:15:33.560] That's great.
[01:15:29.440 --> 01:15:34.360] That's a good place to stop.
[01:15:34.520 --> 01:15:37.480] The origin of language: how we learn to speak and why.
[01:15:37.480 --> 01:15:38.440] There it is.
[01:15:45.480 --> 01:15:48.360] This episode is brought to you by Progressive Insurance.
[01:15:48.360 --> 01:15:50.920] You chose to hit play on this podcast today.
[01:15:50.920 --> 01:15:51.880] Smart choice.
[01:15:51.880 --> 01:15:58.120] Make another smart choice with AutoQuote Explorer to compare rates from multiple car insurance companies all at once.
[01:15:58.120 --> 01:15:59.880] Try it at progressive.com.
[01:15:59.880 --> 01:16:01.960] Progressive casualty insurance company and affiliates.
[01:16:01.960 --> 01:16:03.720] Not available in all states or situations.
[01:16:03.720 --> 01:16:05.640] Prices vary based on how you buy.
[01:16:05.640 --> 01:16:11.080] What I would say to someone who's considering FDU is to come because they give great scholarship opportunities.
[01:16:11.080 --> 01:16:12.840] The professors are very helpful.
[01:16:12.840 --> 01:16:15.720] They want you to learn and they want you to pass.
[01:16:15.720 --> 01:16:18.280] And FDU gives you a second family.
[01:16:18.280 --> 01:16:21.320] Seize the moment and change your world at FDU.
Prompt 6: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 7: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Full Transcript
[00:00:00.480 --> 00:00:04.560] Ever notice how ads always pop up at the worst moments?
[00:00:04.560 --> 00:00:08.640] When the killer's identity is about to be revealed.
[00:00:08.640 --> 00:00:12.320] During that perfect meditation flow.
[00:00:12.320 --> 00:00:16.240] On Amazon Music, we believe in keeping you in the moment.
[00:00:16.240 --> 00:00:26.080] That's why we've got millions of ad-free podcast episodes so you can stay completely immersed in every story, every reveal, every breath.
[00:00:26.080 --> 00:00:33.120] Download the Amazon Music app and start listening to your favorite podcasts ad-free, included with Prime.
[00:00:33.760 --> 00:00:38.800] FanDuel's getting you ready for the NFL season with an offer you won't want to miss.
[00:00:38.800 --> 00:00:44.640] New customers can bet just $5 and get $300 in bonus points if you win.
[00:00:44.640 --> 00:00:45.360] That's right.
[00:00:45.360 --> 00:00:53.440] Pick a bet, bet $5, and if it wins, you'll unlock $300 in bonus bets to use all across the app.
[00:00:53.440 --> 00:01:00.240] You can build parlays, bet player props, ride the live lines, whatever your style, FanDuel has you covered.
[00:01:00.240 --> 00:01:02.240] The NFL season is almost here.
[00:01:02.240 --> 00:01:04.880] The only question is: are you ready to play?
[00:01:04.880 --> 00:01:10.640] Visit FanDuel.com slash bet big to download the FanDuel app today and get started.
[00:01:10.640 --> 00:01:19.600] Must be 21 plus and present in select states for Kansas in affiliation with Kansas Star Casino or 18 plus and present in DC, Kentucky, or Wyoming.
[00:01:19.600 --> 00:01:21.680] First online real money wager only.
[00:01:21.680 --> 00:01:23.520] $5 first deposit required.
[00:01:23.520 --> 00:01:28.160] Bonus issued as non-withdrawable bonus bets, which expire seven days after receipt.
[00:01:28.160 --> 00:01:29.280] Restrictions apply.
[00:01:29.280 --> 00:01:32.160] See terms at sportsbook.fanuel.com.
[00:01:32.160 --> 00:01:33.120] Gambling problem?
[00:01:33.120 --> 00:01:37.040] Call 1-800-GAMBLER or visit fanduel.com slash RG.
[00:01:37.040 --> 00:01:44.080] Call 1-888-789-7777 or visit cpg.org slash chat in Connecticut.
[00:01:44.080 --> 00:01:47.040] Or visit mdgamblinghelp.org in Maryland.
[00:01:47.040 --> 00:01:47.760] Hope is here.
[00:01:47.760 --> 00:01:50.160] Visit gamblinghelpline ma.org.
[00:01:50.160 --> 00:01:55.440] Or call 800-327-5050 for 24-7 support in Massachusetts.
[00:01:55.440 --> 00:02:01.240] Or call 1-877-8 HOPE-N-Y or text H-O-P-E-N-Y in New York.
[00:02:04.760 --> 00:02:10.280] You're listening to The Michael Shermer Show.
[00:02:16.680 --> 00:02:19.240] All right, so your book, The Origin of Language.
[00:02:19.240 --> 00:02:31.240] I love this subject because I love origin questions: the origin of the universe, the origins of life, the origins of complex life, the origins of humanity, the origins of language, morality.
[00:02:31.240 --> 00:02:32.120] That's pretty much it.
[00:02:32.120 --> 00:02:34.920] Those are the big questions that I think scientists should tackle.
[00:02:34.920 --> 00:02:37.000] So, you know, good on you for taking this on.
[00:02:37.000 --> 00:02:38.120] Give us a little bit of background.
[00:02:38.520 --> 00:02:39.240] What's your story?
[00:02:39.240 --> 00:02:40.680] How'd you get into this?
[00:02:41.320 --> 00:02:42.120] Well, it's interesting.
[00:02:42.360 --> 00:02:53.720] So I'm an experimental biologist, and well, as you mentioned, I got a fellowship at the Institute of Advanced Studies in Berlin, which is just a think tank, basically.
[00:02:53.880 --> 00:02:56.440] You can't do any experimental work.
[00:02:56.760 --> 00:02:59.080] So I decided I was going to write a book.
[00:02:59.080 --> 00:03:03.000] And I had this very ambitious project in mind.
[00:03:03.000 --> 00:03:09.080] And the book was going to be called The What Was it called?
[00:03:09.880 --> 00:03:10.680] It doesn't matter.
[00:03:10.680 --> 00:03:11.960] Oh, How Evolution Works.
[00:03:11.960 --> 00:03:13.000] Yes, How Evolution Works.
[00:03:13.720 --> 00:03:17.880] And of course, I didn't know that Ernst Meyer had already written a book with that title.
[00:03:18.040 --> 00:03:22.760] Of course, he's a much more famous evolutionary biologist than I will ever be.
[00:03:23.000 --> 00:03:29.480] But at the same time, I also decided that I was getting a bit tired of writing academic papers.
[00:03:29.480 --> 00:03:34.280] You know, you write a paper and about five people read them, and that's about it.
[00:03:34.280 --> 00:03:39.640] And I thought it would be nice to try and write for a more general audience.
[00:03:39.640 --> 00:03:50.640] But the main reason why I wanted to write the book How Evolution Works is because I always looked at organisms as, you know, from sort of a Darwin perspective.
[00:03:50.880 --> 00:03:55.040] You look at the individual, you try and understand why they behave and the way they behave.
[00:03:55.200 --> 00:04:00.800] Always worked on insects, mainly honeybees and ants.
[00:04:01.120 --> 00:04:15.840] And I thought, well, you know, everyone's talking about the genomes that, of course, the Human Genome Project had come out, which was going to reveal everything that's special about humans, which, of course, it didn't.
[00:04:16.160 --> 00:04:21.120] And then I realized, well, I don't actually understand how you marry the two levels.
[00:04:21.120 --> 00:04:28.880] So you look, natural selection works at the level of the individual, but of course, the changes happen at the level of the genome.
[00:04:28.880 --> 00:04:38.640] And since I'm not a geneticist, and to be honest, I never had any interest in genetics or genomics whatsoever, I thought, well, this might be a good challenge.
[00:04:38.640 --> 00:04:43.680] So let's try and write a book that explains how evolution works.
[00:04:44.000 --> 00:04:46.800] But that's quite a big question.
[00:04:46.800 --> 00:04:50.000] I mean, if you look at Ansmeyer's book, it basically talks about everything.
[00:04:50.000 --> 00:04:53.520] And it's more an academic book, which I didn't want to write.
[00:04:53.520 --> 00:04:58.320] So I thought, well, what can I do to make it interesting to a general audience?
[00:04:58.320 --> 00:04:59.760] Well, what are people interested in?
[00:04:59.760 --> 00:05:02.400] They're interested in humans.
[00:05:02.720 --> 00:05:10.400] I had no interest in human evolution whatsoever, never thought about it, never read anything about it.
[00:05:10.400 --> 00:05:15.680] And then I realized, well, actually, it's amazingly interesting.
[00:05:15.680 --> 00:05:34.680] And then I bumped into all these different little changes, if you wish, because the other angle that I started to take was: if you look at chimpanzees and you look at humans, our genomes are, depending on how you measure it, 98% the same.
[00:05:35.000 --> 00:05:44.040] So, if genes are so important, how can we be genetically so identical and phenotypically and behaviorally, etc., so different?
[00:05:44.040 --> 00:05:47.240] So, that's basically where I started reading.
[00:05:47.240 --> 00:05:55.560] And then I found all these funny mistakes, if you wish, that made us who we are.
[00:05:55.880 --> 00:06:01.800] And then I stumbled on all the morphological changes that allowed us to speak.
[00:06:01.800 --> 00:06:11.960] And then I realized, well, natural selection works on increasing your reproductive output.
[00:06:12.280 --> 00:06:20.040] The way our morphology changed that allowed us to speak coincided with producing those babies, which are really premature.
[00:06:20.040 --> 00:06:35.480] So, putting one and one together, I thought, well, that must be the origin of language to raise babies that are so premature because we have such a large head, which at the same time caused a problem and also gave us the solution.
[00:06:35.480 --> 00:06:36.120] Nice.
[00:06:36.120 --> 00:06:36.680] Wow.
[00:06:36.680 --> 00:06:38.440] That's a perfect introduction.
[00:06:38.600 --> 00:06:48.280] Let me tease this up for you reading the opening paragraphs of my very first column in Scientific American, which I called Darwin's Dictum.
[00:06:48.280 --> 00:07:05.160] In 1861, less than two years after the publication of Charles Darwin's On the Origin of Species, in a session before the British Association for the Advancement of Science, a critic claimed that Darwin's book was too theoretical, and that he should have just put the facts before us and let them rest.
[00:07:05.160 --> 00:07:12.520] In a letter to his friend Henry Fawcett, who was in attendance in his defense, Darwin explained the proper relationship between facts and theory.
[00:07:12.520 --> 00:07:19.520] Here's Darwin: About 30 years ago, there was much talk that geologists ought only to observe and not theorize.
[00:07:19.520 --> 00:07:27.280] And I well remember someone saying that at this rate, a man might as well go into a gravel pit and count the pebbles and describe the colors.
[00:07:27.600 --> 00:07:37.200] How odd it is that anyone should not see that all observation must be for or against some view if it is to be of any service.
[00:07:37.520 --> 00:07:39.120] So I'm teeing this up for you.
[00:07:39.120 --> 00:07:43.280] What is your theory of language for or against?
[00:07:43.920 --> 00:07:48.160] Who has the other theories and what are they and how is yours different?
[00:07:48.800 --> 00:07:57.680] Well, that's interesting because as far as I know, no one came up with a theory for why, no, for how language came about.
[00:07:58.320 --> 00:08:07.840] So I mentioned the mating minds, the Jeff Miller's the mating minds, that the brain is basically a sexually selected organ.
[00:08:08.160 --> 00:08:16.160] It allows men in particular to be really clever and witty and to write poetry and novels, etc.
[00:08:16.480 --> 00:08:20.880] And that's all to do with seducing women, basically.
[00:08:21.520 --> 00:08:29.760] And of course, women also have to have a brain large enough to understand all the jokes because there's nothing worse than making a joke and no one's laughing.
[00:08:29.760 --> 00:08:31.600] So that's the mating mind.
[00:08:32.720 --> 00:08:36.080] Then, of course, Harari has written three books.
[00:08:36.080 --> 00:08:38.080] I haven't read the third one yet.
[00:08:38.400 --> 00:08:48.400] In which he explains human society, which is all based around fictions, fictions that we come up with that we all believe in.
[00:08:48.400 --> 00:08:54.720] And because we all believe in it, society becomes a sort of a cohesive entity.
[00:08:54.720 --> 00:09:04.760] And that, of course, assists humans because strong groups are stronger than the group next door, which is not so strong because they don't have these cohesive beliefs.
[00:09:04.840 --> 00:09:15.000] So, he gives examples like religion, things like money, and of course, we all know, you know, bitcoins are nothing real.
[00:09:15.000 --> 00:09:21.000] Actually, the stock market is not real, but we believe in all these things because we convince ourselves that that's what we believe in.
[00:09:21.240 --> 00:09:24.040] This stuff is worthless, it's just paper and ink.
[00:09:24.040 --> 00:09:24.840] Exactly.
[00:09:24.840 --> 00:09:35.560] And it's only worth something because you trust the system that when you go to the supermarket, you can actually buy something using just a piece of paper.
[00:09:35.560 --> 00:09:41.800] So, that's very convincing, and it explains nicely why societies are the way they are.
[00:09:41.800 --> 00:09:45.080] But he never explains where language came from in the first place.
[00:09:45.720 --> 00:09:49.400] Same with some of the other, sorry, I don't remember what it was.
[00:09:49.560 --> 00:09:54.200] Well, you had, yes, Terence Deacon's, the symbolic species.
[00:09:54.520 --> 00:10:02.280] I don't know if he offered a first origin, but at least the interaction once it gets rolling.
[00:10:02.280 --> 00:10:03.480] Right between brain structures.
[00:10:03.560 --> 00:10:04.440] Oh, the symbolic species.
[00:10:04.600 --> 00:10:13.560] Yeah, the brain structure and language each affect each other in an autocatalytic system, and they kind of develop together.
[00:10:13.880 --> 00:10:23.960] Yes, but he's the one who first came up with the idea of seeing language as a virus, a virus well adapted to the brains of babies, which I think is just brilliant.
[00:10:23.960 --> 00:10:28.600] Because I think, well, to me, that's extremely compelling.
[00:10:28.600 --> 00:10:37.240] Because, so I have this tiny granddaughter, she's actually not that tiny anymore because she's growing like crazy, and she's she's discovering her voice.
[00:10:37.240 --> 00:10:53.280] And of course, she's just mimicking what her parents and other people are doing, but she's a sponge, so she soaks up all the information around her, which of course comes from all the interactions with adults and others around her.
[00:10:53.920 --> 00:11:07.600] And the language is so, if you believe deacon, which as I said is very compelling, language is basically designed to do well in baby brains.
[00:11:07.600 --> 00:11:12.480] So that's why babies are so good at absorbing language.
[00:11:12.480 --> 00:11:27.680] You don't have to teach them, they just listen and they figure out from interactions: oh, okay, if someone is looking at that and they always use that word, then that word must mean that particular object that I always see when they mention it.
[00:11:27.680 --> 00:11:34.800] And it can be green or blue, or it can be big or small, but I'll give the example of the ball.
[00:11:35.120 --> 00:11:44.640] If it's round, you can pick it up, no matter what size it is, no matter what colour, no matter where it is, ball is linked to that particular object.
[00:11:44.640 --> 00:11:46.800] And that's how they learn.
[00:11:47.120 --> 00:11:52.080] And languages, of course, would go extinct if we wouldn't be speaking them.
[00:11:52.080 --> 00:11:55.040] So language is really akin to a virus.
[00:11:55.040 --> 00:12:02.000] If there's no brains to live in, and brains that transmit it to another brain, then languages would go extinct.
[00:12:02.000 --> 00:12:04.240] Because humans also depend on language.
[00:12:04.240 --> 00:12:15.120] I mean, we would still be alive if we wouldn't be able to speak, but we would certainly become a very different kind of species, I think.
[00:12:15.120 --> 00:12:20.480] So I really like his view of language as a virus.
[00:12:20.480 --> 00:12:38.200] And it completely fits with my idea that language is needed for babies, to raise babies, but of course, the babies also need the language because they have to be able to manipulate others around them to look after them for about 15 years because that's how long it takes them to become independent.
[00:12:38.200 --> 00:12:38.920] Yeah.
[00:12:39.160 --> 00:12:43.640] Since you mentioned the association between words and objects.
[00:12:43.880 --> 00:12:52.440] So go back a little bit in history to Skinner's theory of language acquisition and then what Chomsky did to challenge that.
[00:12:53.400 --> 00:13:10.760] Well Skinner saw language as sort of an association game, which in a way it is, because you, as I said, the example that I gave of the ball, if you always hear the word ball and it's a particular object that looks around and you can pick it up, then you start making that association.
[00:13:10.760 --> 00:13:14.680] But the association is, of course, not the same as syntax.
[00:13:14.680 --> 00:13:24.360] So Chomsky came up with this idea, which is based on the fact that children just effortless learn languages, no matter how many they're exposed to when they're really young.
[00:13:24.360 --> 00:13:30.520] They just pick them up, which I find is really mean because I'm learning Spanish at the moment and it's really hard.
[00:13:31.560 --> 00:13:42.440] So he came up with this innate language acquisition device, which is basically something I always see the skin as sort of a miracle.
[00:13:42.440 --> 00:13:51.160] So all of a sudden humans had this language acquisition device, which means that babies understand syntax.
[00:13:51.160 --> 00:13:57.560] So from that the idea became okay that there must be a language gene.
[00:13:57.560 --> 00:14:14.760] So there's a gene that makes us different from all our ancestors, from chimpanzees, and that gene allowed our babies to understand language with no particular learning skills or whatever.
[00:14:15.440 --> 00:14:25.040] Now, of course, people started looking for the language gene, and at some stage, people thought they found a language gene in the form of FOXP2.
[00:14:26.320 --> 00:14:34.960] Which is interesting because FOXP2 is a gene that is important for vocalization, but not just in humans.
[00:14:34.960 --> 00:14:41.360] In almost all vertebrates, FOXP2 is present and it's important in focalization.
[00:14:41.360 --> 00:14:48.240] If you knock it out in mice, the baby mice don't produce a sound anymore that attracts the mother.
[00:14:50.720 --> 00:15:04.320] And then another study found or claimed that there were signs or signature of positive selection on fox P2 in our lineage.
[00:15:04.320 --> 00:15:13.200] So they took that to mean that clearly there's been selection on FOXP2 in humans, in Homo sapiens only.
[00:15:13.200 --> 00:15:20.160] So that means it really is very strong evidence that that's a gene for language.
[00:15:20.160 --> 00:15:28.640] And the later study first found that the same FOXP2 gene can be found in Yanditals and Dennis Ophens.
[00:15:29.280 --> 00:15:36.720] And then a bigger study that took much, many more samples totally debunked the positive selection.
[00:15:36.720 --> 00:15:41.120] So they found no evidence of positive selection on FOXP2.
[00:15:41.120 --> 00:15:46.320] And I think that was the nail in the coffin of FOXP2 being the language gene.
[00:15:46.320 --> 00:15:49.520] So language gene just doesn't exist.
[00:15:49.520 --> 00:16:00.040] Oh, I forgot to mention that it all started with this particular family in the UK that had a very weird inability to speak, basically.
[00:15:59.760 --> 00:16:06.120] So they seemed to be of average intelligence, but they could not speak properly.
[00:16:06.760 --> 00:16:08.600] And they all had a mutation.
[00:16:08.600 --> 00:16:11.720] And that mutation was in a particular part of a chromosome.
[00:16:11.720 --> 00:16:12.840] I can't remember which one.
[00:16:12.840 --> 00:16:19.160] And then later, in that particular part of their chromosome, they found the FOXP2 gene.
[00:16:19.160 --> 00:16:20.920] So that's how it all started.
[00:16:20.920 --> 00:16:23.080] And of course, it was a brilliant story.
[00:16:23.080 --> 00:16:24.760] It just didn't hold up.
[00:16:24.760 --> 00:16:25.880] Well, there's no language.
[00:16:26.600 --> 00:16:34.920] Would it be correct to say that the FOXP2 gene is necessary but not sufficient to explain language acquisition?
[00:16:36.120 --> 00:16:39.960] It's necessary for vocalization, yes.
[00:16:39.960 --> 00:16:40.760] Oh, for vocalization.
[00:16:41.000 --> 00:16:42.360] But not for comprehension.
[00:16:42.360 --> 00:16:49.320] In other words, the members of this family could comprehend what somebody was saying, but they just couldn't answer.
[00:16:49.960 --> 00:16:51.240] I think so, yes.
[00:16:51.240 --> 00:16:58.680] But the other thing that they couldn't do, so apparently it would keep, and I found it in Pinker's book.
[00:16:59.160 --> 00:17:08.520] But little children, if you make up a verb, then they can infer what the past tense of that verb is because they just know the rules.
[00:17:08.840 --> 00:17:11.240] And these people couldn't do that either.
[00:17:11.240 --> 00:17:12.040] Oh, like.
[00:17:12.120 --> 00:17:13.400] So the comprehension.
[00:17:13.400 --> 00:17:13.720] Yeah.
[00:17:13.720 --> 00:17:14.680] Like add an ED.
[00:17:14.840 --> 00:17:15.320] They didn't have a.
[00:17:15.960 --> 00:17:17.160] They couldn't get that interesting.
[00:17:17.240 --> 00:17:23.000] I wonder if they could read, comprehend through reading, if not hearing.
[00:17:23.000 --> 00:17:26.840] Or if they could type out a response but not say it.
[00:17:28.120 --> 00:17:31.480] I'm not sure because writing is much more difficult than speaking, though.
[00:17:31.480 --> 00:17:34.840] Yeah, but writing is only about 5,000 years old.
[00:17:34.840 --> 00:17:35.720] Yes, that's right.
[00:17:35.720 --> 00:17:42.840] But I was just thinking of stroke patients who cannot speak, but they can type out what they're trying to say.
[00:17:42.840 --> 00:17:46.240] But that would be somebody who already has language acquisition.
[00:17:46.240 --> 00:17:46.720] Yeah.
[00:17:44.840 --> 00:17:50.400] But also to mention where in the brain the damage is.
[00:17:44.840 --> 00:17:51.760] I mean, that's the other interesting thing.
[00:17:52.000 --> 00:17:58.240] So, you know, people have also been looking for the part of your brain that's responsible for language.
[00:17:58.560 --> 00:18:10.880] But different kinds of damage have in your brain have different effects on your linguistic abilities, understanding, comprehension, speaking.
[00:18:11.520 --> 00:18:13.840] Oh, is that Broca's area?
[00:18:14.720 --> 00:18:15.040] I forget.
[00:18:15.120 --> 00:18:15.600] Well, that's one of the things.
[00:18:15.760 --> 00:18:18.640] I forget what that was for just for comprehension, maybe.
[00:18:18.960 --> 00:18:19.440] Yeah.
[00:18:19.760 --> 00:18:28.960] But now, this is a more recent study which looked at a geometry of the brain, which claims that there aren't.
[00:18:28.960 --> 00:18:40.640] Well, it's not saying that there aren't specific areas in the brain that do something for something specific, but that the connections within the brain are much more fluid.
[00:18:40.960 --> 00:18:49.440] It's more like waves, information waves going through the brain instead of particular parts of the brain being very essential for particular behaviors.
[00:18:49.440 --> 00:18:50.000] Yeah.
[00:18:50.000 --> 00:18:53.680] So your point is having the Fox P2 gene isn't enough.
[00:18:53.680 --> 00:19:01.760] So Neanderthals and Denisovans had them, had that, but we don't, that doesn't automatically mean they could speak.
[00:19:01.760 --> 00:19:05.920] Maybe they could, maybe they had language, but we just don't know that for sure, right?
[00:19:06.880 --> 00:19:11.440] No, the trouble is we never can, but I strongly suspect that they couldn't.
[00:19:11.440 --> 00:19:13.280] At least not the way we speak.
[00:19:13.280 --> 00:19:16.240] For the simple reason that they're, well, two reasons.
[00:19:16.240 --> 00:19:19.680] Their anatomy wasn't right, so their necks were too short.
[00:19:19.680 --> 00:19:24.960] Of course, I have to caveat that because we don't know anything about the Denisolvans.
[00:19:24.960 --> 00:19:26.160] You say Denisophants?
[00:19:26.200 --> 00:19:29.880] I think that's I think you're saying it the correct way.
[00:19:30.200 --> 00:19:30.920] Oh, good.
[00:19:30.920 --> 00:19:31.960] I'm glad.
[00:19:29.440 --> 00:19:36.520] So we don't know anything about them because we have a pinky and I think there's a molar.
[00:19:36.680 --> 00:19:40.840] So there's a lot of DNA, but no idea what they look like.
[00:19:40.840 --> 00:19:45.160] But Neanderthals seem to have had a neck that's too short.
[00:19:45.160 --> 00:19:57.880] Too short in the sense that the larynx would sit too low in the neck to be able to make the sounds that we can make.
[00:19:58.200 --> 00:20:42.520] Then there's this other more recent study that looks at epigenetic marks and they found that so if you look at the differences in the epigenome, so that's not the sequence of DNA, but the methylation pattern of the DNA, if you compare that between they used chimpanzees, Neanderthals, Denisulphans, and Homo sapiens, and then they found that the biggest difference, I think it was 53 differently methylated genes in humans that all have something to do with the larynx and the vocal cavity, also with the pelvis, but that's not interested for not interesting for language.
[00:20:42.760 --> 00:20:58.360] So those two pieces of information I think will make me believe, or believe is a bad word, but make me think that we are the only ones that have this precise form of communication.
[00:20:58.360 --> 00:21:18.720] That's not to say that Neanderthals and Denisulphans didn't have any communication, because I think all humans, you know, and now with humans, I mean all our ancestors when we split from the common ancestor with chimpanzees or bonobos, they all must have had some form of communication because all social animals have a form of communication.
[00:21:19.040 --> 00:21:28.720] I'm just arguing that we are the only ones with a sophisticated form of communication, which is due to many flukes of nature and the need to look after our babies.
[00:21:29.040 --> 00:21:29.840] Yeah.
[00:21:29.840 --> 00:21:42.160] So, yeah, maybe make a distinction between language and communication since you wrote about ants solving the traveling salesman problem of how to, you know, cover a certain amount of territory the most efficient way.
[00:21:42.160 --> 00:21:48.000] Of course, they don't have symbolic language or whatever, so communication can be done, but it lacks what?
[00:21:48.320 --> 00:21:54.320] Symbolic, you know, symbolism as a form of communication?
[00:21:55.280 --> 00:21:58.720] Well, I think you need symbolism before you can have a symbolic language.
[00:21:58.720 --> 00:22:00.480] And of course, our language is symbolic.
[00:22:00.480 --> 00:22:07.760] I mean, the other symbolic language which is quite different is the bees dance language.
[00:22:07.760 --> 00:22:14.320] So they code distance and profitability in the way they dance.
[00:22:15.440 --> 00:22:20.320] If the source that they're dancing for is a long way away, they dance for longer.
[00:22:20.320 --> 00:22:35.760] And the direction in which you have to fly to get there is determined by, or is symbolized in the angle relative to the sun, and how great the food source is that they've found is reflected in how enthusiastic they dance.
[00:22:35.760 --> 00:22:41.520] So that's a symbolic language, but it only has three bits of information.
[00:22:42.720 --> 00:22:47.200] Our language is special.
[00:22:47.200 --> 00:22:51.200] And of course, you know, cetaceans, they have forms of communication.
[00:22:51.200 --> 00:22:59.040] I was actually reading or hearing falls here, and for me, that means comfort meals, cozy nights, and tail-gating weekends.
[00:22:59.040 --> 00:23:01.800] And Omaha Steaks makes it all easy.
[00:22:59.840 --> 00:23:05.960] I love having their premium steaks and juicy burgers ready in my freezer.
[00:23:06.280 --> 00:23:13.000] I recently grilled their filet mignon, so tender, flavorful, and better than anything I've had elsewhere.
[00:23:13.000 --> 00:23:21.560] Right now, during their red-hot sale, you get 50% off site-wide, plus get an extra $35 off with code flavor at checkout.
[00:23:21.560 --> 00:23:24.680] Get fired up for fall grilling with Omaha Steaks.
[00:23:24.680 --> 00:23:30.520] Visit Omahasteaks.com for 50% off site-wide during their red-hot sale event.
[00:23:30.520 --> 00:23:35.080] And for an extra $35 off, use promo code flavor at checkout.
[00:23:35.080 --> 00:23:42.280] That's 50% off at Omahasteaks.com and an extra $35 off with promo code flavor at checkout.
[00:23:42.280 --> 00:23:43.800] See site for details.
[00:23:43.800 --> 00:23:52.760] Hearing, I can't quite remember that dolphins give themselves a particular name, so it's a particular sound that they only use to refer to themselves.
[00:23:52.760 --> 00:23:55.800] That, of course, is also symbolic.
[00:23:56.120 --> 00:23:59.000] But our language is infinite.
[00:23:59.000 --> 00:24:05.000] So we can use words in different syntax basically to mean different things.
[00:24:05.000 --> 00:24:14.440] So we can basically say whatever we want, and we can talk about the past, we can talk about what's happening now, we can talk about the future, we can talk about things that don't exist.
[00:24:14.440 --> 00:24:18.040] I mean, that's all Harari's argument.
[00:24:18.280 --> 00:24:22.680] You know, we talk about things that don't exist and pretend that they do exist.
[00:24:22.680 --> 00:24:31.320] So that's what makes our language, our form of communication, special or unique and special from everything else.
[00:24:31.560 --> 00:24:33.720] But bacteria, they communicate.
[00:24:33.720 --> 00:24:34.520] Yes, right.
[00:24:34.520 --> 00:24:34.920] Yeah.
[00:24:35.240 --> 00:24:42.520] Yeah, it does feel like there's a big gap there that sounds where explanations sound like, and then a miracle happened.
[00:24:42.760 --> 00:24:44.200] You know, and then we have this thing.
[00:24:44.200 --> 00:24:51.520] So, I mean, you're point that we still have to find a gradual evolutionary sequence of how this came about and why.
[00:24:51.840 --> 00:24:56.960] So, maybe speak for just a moment back to general how natural selection works.
[00:24:57.120 --> 00:25:01.360] Why do you think it's natural selection and not sexual selection, or do you think it's both?
[00:25:01.360 --> 00:25:06.000] And is language an adaptation or more of an exaptation spandrel?
[00:25:06.320 --> 00:25:10.640] Why would you think it could be either one of those?
[00:25:10.640 --> 00:25:17.440] Maybe just kind of give that general how evolution works with the context of language development.
[00:25:17.760 --> 00:25:38.800] Well, of course, in general, how evolution works, if you have, if you as an individual have some sort of trait that gives you a leg up in competition with others in your population, leg up in a sense that you can produce more offspring or you can produce them more efficiently, then that will be selected for if it is heritable.
[00:25:38.800 --> 00:26:01.600] So, my argument is that because we had all these genetic and morphological flukes in our evolutionary history that I describe in the book, at some stage we had this really strange fluke in the sense that we had a pseudogene that's still a pseudogene in gorillas and chimpanzees, got repaired, duplicated.
[00:26:01.600 --> 00:26:11.920] We have all these inversions and chromosomes, which are weird genetic aberrations, and that basically gave us more neurons.
[00:26:11.920 --> 00:26:14.960] So, our head started to balloon.
[00:26:15.280 --> 00:26:22.560] If your head starts to balloon, you have to come up with a solution because, you know, your skull needs to adapt.
[00:26:22.560 --> 00:26:30.280] The skull is a very malleable structure because it's comprised of all these different components, it sort of can move.
[00:26:29.440 --> 00:26:34.200] So the skull could adapt to this expanding brain.
[00:26:34.520 --> 00:26:47.000] So just reiterate again, the brain expanded because of genetic fluke, a duplication or triplication of a particular gene that prevented the stem cells of neurons to keep dividing for longer.
[00:26:47.000 --> 00:26:53.080] So you get more neurons, the neurons migrate to the cortex, the cortex expands.
[00:26:53.400 --> 00:26:59.480] The skull adapted to this increase in brain mass to basically make it fit.
[00:26:59.480 --> 00:27:03.880] And that then led to a lengthening of the throat, a descent of the larynx.
[00:27:04.040 --> 00:27:18.360] All of a sudden, in a species that's already highly social, we started to be able to make more precise sounds just because of these changes in our morphology.
[00:27:18.680 --> 00:27:22.040] Now, so these are all flukes in a way.
[00:27:22.040 --> 00:27:25.160] But precise communication was an adaptation.
[00:27:25.160 --> 00:27:27.160] So I don't think it was an acceptation.
[00:27:27.160 --> 00:27:36.280] I think it was an adaptation because this now made it possible to be a bit more precise in the way you communicate.
[00:27:36.920 --> 00:27:43.400] If you were more precise in the way that you communicate, you can elicit more help.
[00:27:43.720 --> 00:28:04.680] Little babies that were now born even more prematurely because of this big brain, they could no longer stay in the mother because otherwise either they wouldn't be able to fit through the pelvis, through the birth canal, but also because their brain was so large, it's metabolically extremely expensive.
[00:28:04.680 --> 00:28:10.360] So, for the mother to increase the length of the pregnancy, it's just metabolically not possible.
[00:28:10.360 --> 00:28:14.280] So, two reasons why those babies needed to be born very prematurely.
[00:28:14.280 --> 00:28:36.560] You now have a very premature brain in a little baby that's outside of the womb, surrounded by other individuals, because our ancestors were already social because they decided to become bipedal in an environment which is extremely dangerous because of all the predators that can run much faster.
[00:28:36.560 --> 00:28:50.320] So, my argument is from Lucy onwards, so that's Astrolopithecus afarensis, 4.4 million years ago or so, our ancestors already started to live in family groups.
[00:28:50.320 --> 00:28:53.520] And these family bonds just became stronger and stronger.
[00:28:53.520 --> 00:29:01.120] So, we already have this very strong social kin group structure, which requires communication.
[00:29:01.120 --> 00:29:12.400] As I said earlier, all social animals need some form of communication, but now through all these flukes of nature, we had a means to be a little bit more precise in our communication.
[00:29:12.400 --> 00:29:31.360] The babies, because they're so neuroplastic, pick up all those social cues, and also for them to be able to elicit more help by some form of more precise communication gives them a leg up because now they get more resources relative to other babies.
[00:29:31.360 --> 00:29:50.640] So, you get this positive selection, I think, also as high or positive feedback, I should say, between just being a little bit better at communication, making it a little bit easier to raise those dependent babies, which then selects for getting better still.
[00:29:50.640 --> 00:30:03.560] And you basically have runaway selection between the need to look after those extremely needy babies and a form of communication that can be extremely precise.
[00:30:04.200 --> 00:30:25.240] Once, of course, you have something like language, then you can get all the other things, then you can have sexual selection come and play a role, the mating mind like Miller, or all the fictions that Harare talks about that makes human societies even stronger because of these shared beliefs.
[00:30:25.560 --> 00:30:34.680] But I still think if you drill down to the essence of Darwinian natural selection, it has to have something to do with giving you a reproductive advantage.
[00:30:34.680 --> 00:30:40.040] So that's why I think language is an adaptation and not an exaptation.
[00:30:40.040 --> 00:30:44.520] And it has everything to do with increasing reproductive output.
[00:30:45.800 --> 00:30:46.600] Perfectly said.
[00:30:46.600 --> 00:30:47.960] Wow, incredible.
[00:30:47.960 --> 00:30:49.560] You're so articulate on this.
[00:30:49.560 --> 00:30:51.000] Of course, you're the author of the book.
[00:30:51.000 --> 00:30:52.680] You wrote about Alfred Russell Wallace.
[00:30:52.840 --> 00:30:55.000] You know, I wrote a biography of him.
[00:30:55.000 --> 00:30:58.040] That was my doctoral dissertation in the history of science.
[00:30:58.440 --> 00:31:12.920] He was kind of a hyper-adaptationist, such that if he could not think of what the adaptive purpose would be of having a brain, you know, four times the size that we really need it, chimps do just fine with a much smaller brain.
[00:31:12.920 --> 00:31:14.280] What do we need that for?
[00:31:14.280 --> 00:31:23.240] Why do you need to be able to abstractly reason to be able to do calculus, for example, or aesthetic appreciation like music and so on?
[00:31:23.240 --> 00:31:33.880] Not being able to think of how that could be adaptive to lead more offspring and so on, then he invoked, well, there must be some higher power that kind of plopped that in there.
[00:31:34.120 --> 00:31:44.520] So that's one of the dangers of hyper-adaptationism, which is why Steve Gould always kind of pushed back against that, because some people pushed, like Wallace pushed it too far.
[00:31:44.960 --> 00:31:54.160] But in terms of a Darwinian explanation, then let's talk about the great apes and their language as something of a bridge, maybe.
[00:31:54.160 --> 00:31:57.440] I think what you're arguing goes beyond what they're able to do.
[00:31:57.440 --> 00:32:04.320] And they did not have those quirky little contingent events that happened in their skulls and so forth.
[00:32:04.320 --> 00:32:07.680] And yet they still have some symbolic communication, right?
[00:32:07.680 --> 00:32:15.440] So you wrote about Kanzi, the bonobo, and gorillas like Coco and Nimchimsky.
[00:32:16.240 --> 00:32:17.440] I saw that documentary.
[00:32:17.440 --> 00:32:21.040] That was a pretty depressing, sad, tragic story.
[00:32:21.040 --> 00:32:22.880] It was very sad, actually.
[00:32:23.440 --> 00:32:28.480] What was the name of the famous psychologist that ran that experiment?
[00:32:28.480 --> 00:32:29.840] I'm trying to remember his name.
[00:32:29.840 --> 00:32:31.040] It was infuriating.
[00:32:31.760 --> 00:32:33.200] Sharon's something rather, I think.
[00:32:33.520 --> 00:32:40.480] Yeah, but that was the 1960s when there was less concern about animal cruelty and rights and so forth.
[00:32:41.520 --> 00:32:44.400] But, you know, we've kind of at Skeptic, we've sort of followed the debate.
[00:32:44.400 --> 00:32:47.840] You know, is this a form of language?
[00:32:47.840 --> 00:33:12.320] You know, where you see these videos, like Sue Savage Ramba's videos with Kanzi, you know, they're impressive, but then there are critics that say, well, but there could be more of a clever Hans effect where the chimp or the gorilla is reading the body language of the controller or the, you know, their caretaker, and they're kind of doing a more of a Skinnerian association.
[00:33:12.320 --> 00:33:14.400] In other words, the lights aren't on upstairs.
[00:33:14.400 --> 00:33:20.480] They're just kind of processing it in some more Skinnerian stimulus response way.
[00:33:20.800 --> 00:33:33.880] But you made the case in your book that Kanzi probably does have more comprehension of language and words than just what a Skinnerian association model would provide.
[00:33:35.160 --> 00:33:39.720] Well, body language is the start to any form of communication, I think.
[00:33:40.680 --> 00:33:44.600] So I wouldn't say that that is something completely different.
[00:33:44.600 --> 00:33:46.920] I mean, we all use body language.
[00:33:46.920 --> 00:33:54.360] Even now, you know, we have the sophisticated form of communication, but we still really rely on body language.
[00:33:54.520 --> 00:34:09.800] So I think that's one of the first, well, essential, basically, you need to be able to read someone else's body language before you can have any other form of communication.
[00:34:09.800 --> 00:34:16.280] Now, someone did a, I really love those videos, by the way, of Serbin and Kansy.
[00:34:18.920 --> 00:34:22.040] So someone did an analysis.
[00:34:22.040 --> 00:34:29.240] So he went through all the videos that are available online from all the experiments that she did, if you call them experiments.
[00:34:29.240 --> 00:34:33.320] It's more, you know, living with Kenzie in a way.
[00:34:33.960 --> 00:34:44.760] And Stephen Pinker once made, used the example in his book, Dog Bites Man, will not get into the newspaper, but man bites dog probably will.
[00:34:44.760 --> 00:34:49.720] So you have the same words, different syntax, completely different meaning.
[00:34:49.720 --> 00:35:04.280] So he looked at all the videos of Cansey and all the phrases and sentences that the researchers used and came to the conclusion.
[00:35:04.280 --> 00:35:09.560] It's been a while since I read that paper, so forgive me, I don't know, I can't remember all the details.
[00:35:09.560 --> 00:35:23.040] But he came to the conclusion that it's Kenzie does comprehend language because he understands that words in a different order mean something different.
[00:35:23.040 --> 00:35:35.920] So then, if that is true, then you have to appreciate that he does understand language as such and not just use the Skinnerian associations.
[00:35:36.880 --> 00:35:48.400] But still, quite a quantitative difference between what Kanzi and Coco could do in NIM versus what humans could do.
[00:35:48.400 --> 00:35:52.160] Is it quantitatively different enough to be qualitatively different?
[00:35:52.160 --> 00:35:56.320] In other words, Kanzi can't tell me that his father was poor but honest, right?
[00:35:56.640 --> 00:35:58.240] Or something like that.
[00:35:58.240 --> 00:35:58.800] No.
[00:36:00.080 --> 00:36:00.880] No, definitely.
[00:36:00.880 --> 00:36:03.200] I mean, there's a difference in brain size too.
[00:36:03.200 --> 00:36:03.760] Yeah.
[00:36:03.760 --> 00:36:05.280] And there's lots of differences.
[00:36:05.280 --> 00:36:13.440] So I'm not saying that Kenzie would be able to speak like you and I are speaking now if he just had the different morphology.
[00:36:13.760 --> 00:36:15.520] But the two things go together again.
[00:36:15.520 --> 00:36:23.600] I mean, Darwin himself made that remark that other primates probably could.
[00:36:23.920 --> 00:36:37.920] So their focal cords, I think he made the argument that there's no morphological reason why they wouldn't be able to speak, but it's the lack of brain size that prevented any further increase in their ability to communicate.
[00:36:38.240 --> 00:36:40.800] And I think he was dead right.
[00:36:41.120 --> 00:36:52.320] So they don't have the morphology to be able to make the precise signs that we are making, which then, of course, means that there was no positive feedback again on the brain either.
[00:36:52.320 --> 00:36:56.640] But also, they didn't have this weird fluke that made their brain balloon.
[00:36:56.640 --> 00:37:01.080] So, that's why the story is not a linear story.
[00:36:59.440 --> 00:37:09.400] So, people like thinking in linearities, but it's not, there's all these things that go together that happened sort of at the same time.
[00:37:09.400 --> 00:37:17.480] And, in my view, every time one of those weird flukes happened in our evolutionary history, natural selection just had to make do.
[00:37:17.480 --> 00:37:28.440] Either I'm going to let this weird species go extinct because now it's got this strange fluke and I don't know what to do with it, or okay, we'll find a solution.
[00:37:28.440 --> 00:37:36.120] And natural selection, every time in our evolutionary history, find a solution that just, okay, we can cope with this weird fluke now.
[00:37:36.120 --> 00:37:38.280] Okay, they're now walking on two legs.
[00:37:38.280 --> 00:37:49.080] Well, then they basically have to stick together because otherwise they will be predated on because the poor buggers can't even run fast enough because they started walking on two legs.
[00:37:49.080 --> 00:37:57.640] The pelvis had to change even more to make it to increase the balance.
[00:37:57.640 --> 00:38:02.680] That then led them, Homo erectus, to be able to run long distances.
[00:38:02.680 --> 00:38:06.040] If you run long distances, you have to be able to sweat, etc.
[00:38:06.120 --> 00:38:20.440] So, every time there was some weird event in our evolutionary history, natural selection allowed our species, our ancestors, to cope with it until the next one came.
[00:38:20.440 --> 00:38:22.680] So, that's how I sort of see it.
[00:38:22.680 --> 00:38:38.520] There's all these weird things, weird things just happen, and then either it's selected against, or it doesn't necessarily have to be selected for, it just has to be good enough to be able to stay alive and reproduce.
[00:38:38.920 --> 00:38:46.400] People tend to think that natural selection drives species towards something, but of course it doesn't.
[00:38:46.400 --> 00:38:48.800] Either you stay alive or you don't.
[00:38:44.840 --> 00:38:50.560] Those are the two options.
[00:38:50.880 --> 00:38:53.360] Yeah, there's no directionality in evolution.
[00:38:53.360 --> 00:38:58.240] It seems like it in hindsight, and we put ourselves at the pinnacle.
[00:38:58.240 --> 00:39:02.400] But, you know, if big brains were so great, how come almost no species have them?
[00:39:02.400 --> 00:39:03.360] They do just fine.
[00:39:03.360 --> 00:39:04.560] This is Wallace's point.
[00:39:04.560 --> 00:39:07.920] Most species do just fine with tiny little brains.
[00:39:08.560 --> 00:39:13.200] Well, most species can't have a large brain because they're so expensive.
[00:39:13.200 --> 00:39:13.680] Yeah.
[00:39:14.000 --> 00:39:36.160] So one reason, again, a very compelling argument for why the megafauna went extinct as soon as humans made an appearance is because these species were so huge, it takes a very long time to mature, which is fine if you don't have the risk of being killed off before you start reproducing.
[00:39:36.160 --> 00:39:41.360] So in the absence of predators, species can grow to enormous size.
[00:39:41.680 --> 00:39:44.560] Humans came along, they started picking them off.
[00:39:45.200 --> 00:39:50.080] It's not even necessary to assume that humans deliberately targeted the megafauna.
[00:39:50.080 --> 00:39:56.720] It's just megafauna, you know, you can't ignore a woolly mammoth, for example, because it's so big.
[00:39:56.720 --> 00:40:09.280] So they started killing them, which then selected against these very large animals because many of them would be killed before they're old enough, mature enough to reproduce.
[00:40:09.280 --> 00:40:11.360] The same is true of the brain.
[00:40:11.360 --> 00:40:17.360] So brain size is limited by body size.
[00:40:17.680 --> 00:40:25.200] If you want to grow a very large brain, sorry, not body size, it just takes a very long time to grow a large brain.
[00:40:25.200 --> 00:40:44.680] So, if you want, if you want, in the evolutionary sense, if you want to grow a large brain, you have to mature very slowly just because it takes how long to grow that brain, which makes you vulnerable, just like a megafauna, to die before you're mature enough to produce offspring.
[00:40:44.680 --> 00:40:47.160] So, it's a very tricky strategy.
[00:40:47.160 --> 00:41:06.600] So, only so I use this example of a meta-analysis that's been done on birds, where they basically try to figure out what limits brain size in birds, because there are many very clever birds and they have a large brain.
[00:41:06.600 --> 00:41:24.280] Every single one of them has very extended parental care, which means that the parents, while the baby bird or the young bird is growing its brain, the brain isn't sophisticated enough to be able for the bird to look after itself.
[00:41:24.280 --> 00:41:32.120] So, it can only get away with having this slow maturing large brain if it has other individuals looking after it.
[00:41:32.120 --> 00:41:34.200] And the same is true in our case.
[00:41:34.200 --> 00:41:44.040] Our babies could not get away with being born so totally helpless if there weren't all these other individuals around them to help raise them.
[00:41:44.200 --> 00:41:45.560] So, there's a trade-off.
[00:41:45.560 --> 00:41:54.440] So, most species just can't have a large enough brain because they don't have all the other conditions around it.
[00:41:54.440 --> 00:41:57.400] And that's called the grey ceiling hypothesis.
[00:41:57.400 --> 00:42:09.160] So, there's a particular brain size beyond which it's very difficult to, well, you basically can't get through this grey ceiling unless you're a hypersocial species.
[00:42:09.160 --> 00:42:11.800] And that's also where grandmothers come in.
[00:42:11.800 --> 00:42:20.720] So, the menopause or the grandmother hypothesis: why do human females stop reproducing or stop, yeah, well, yes, stop reproducing.
[00:42:14.680 --> 00:42:21.280] Two reasons.
[00:42:21.520 --> 00:42:30.000] One's if one, if it takes so long for your offspring to mature, if you reproduce at a every business has an ambition.
[00:42:30.000 --> 00:42:39.440] PayPal Open is the platform designed to help you grow into yours with business loans so you can expand and access to hundreds of millions of PayPal customers worldwide.
[00:42:39.440 --> 00:42:47.840] And your customers can pay all the ways they want with PayPal, Venmo, Pay Later, and all major cards so you can focus on scaling up.
[00:42:47.840 --> 00:42:52.720] When it's time to get growing, there's one platform for all business: PayPal Open.
[00:42:52.720 --> 00:42:55.280] Grow today at paypalopen.com.
[00:42:55.280 --> 00:42:57.920] Loan subject to approval in available locations.
[00:42:58.080 --> 00:43:03.040] Old age, then you're very likely that you die before your child is independent.
[00:43:03.040 --> 00:43:11.920] And then it's much better evolutionary scene to help raise your children, raise their children.
[00:43:11.920 --> 00:43:14.800] In other words, raise your grandchildren.
[00:43:15.120 --> 00:43:19.840] So grandmothers in particular, people don't tend to talk about grandfathers so much.
[00:43:19.840 --> 00:43:20.240] Yeah, hey.
[00:43:21.600 --> 00:43:22.960] I know, I know, I know.
[00:43:25.040 --> 00:43:26.320] Well, there are two reasons.
[00:43:26.320 --> 00:43:36.800] One is that grandfather or fathers in particular, in general, I should say, are never 100% sure that the children they're raising are theirs genetically.
[00:43:37.360 --> 00:43:45.920] So there's paternity uncertainty, which leads to a lot of very interesting behaviors in many animals, including humans, actually.
[00:43:46.480 --> 00:43:47.520] So that's one.
[00:43:47.520 --> 00:43:51.040] But also, for males, there is no incentive.
[00:43:51.040 --> 00:43:56.400] There's no reason why they should not stop reproducing because they can always find a younger female.
[00:43:56.400 --> 00:44:07.080] So those two reasons sort of not select against, but they don't make grandfathering as profitable in a kin selection sense as grandmothering is.
[00:44:07.400 --> 00:44:07.720] Yeah.
[00:44:08.520 --> 00:44:09.880] I have nothing against grandfathering.
[00:44:10.040 --> 00:44:12.840] Mama's baby, daddy's maybe, right?
[00:44:13.160 --> 00:44:14.760] Exactly, exactly.
[00:44:15.880 --> 00:44:20.520] Yeah, well, that explanation also goes a long way to explaining why we die.
[00:44:20.520 --> 00:44:22.680] You know, why can't we just live forever?
[00:44:23.240 --> 00:44:33.080] And the answer seems to be: well, once you've become a parent and then a grandparent, natural selection has pretty much done its job of getting your genes into the next generation.
[00:44:33.080 --> 00:44:44.040] And there's just no point in putting all that extra energy into keeping your body alive for 200 or 300 years when your genes are already into the future.
[00:44:45.320 --> 00:44:46.120] That's correct.
[00:44:46.120 --> 00:44:46.360] Yeah.
[00:44:46.360 --> 00:44:49.240] So that's the soma, what is it?
[00:44:49.240 --> 00:44:52.360] The soma death theory or whatever it's called.
[00:44:52.360 --> 00:44:52.760] Yeah.
[00:44:53.480 --> 00:44:53.800] Yes.
[00:44:53.800 --> 00:44:55.480] The disposable soma theory.
[00:44:55.480 --> 00:44:55.960] That's right.
[00:44:55.960 --> 00:44:56.360] That's right.
[00:44:56.360 --> 00:44:56.840] Yeah, that's right.
[00:44:57.320 --> 00:45:00.680] So keep you alive until you've reproduced and then you're done.
[00:45:00.680 --> 00:45:10.040] But of course, that changed in our lineage again, especially for females, because they live way beyond their reproductive age.
[00:45:11.560 --> 00:45:15.320] Of course, it doesn't mean that we can now live up to 200 years.
[00:45:15.320 --> 00:45:23.880] I know some people are convinced that they can, and they do all these weird experiments on themselves to see if they can get skeptic.
[00:45:23.960 --> 00:45:28.520] It's not going to happen unless there's some miracle breakthrough, technologically speaking.
[00:45:28.520 --> 00:45:34.440] Really, what's happening is modern medicine is getting more and more people up to the upper ceiling.
[00:45:34.600 --> 00:45:38.440] But no one's going past, say, 115 or 120.
[00:45:38.840 --> 00:45:41.080] Again, unless they're 122, apparently.
[00:45:41.080 --> 00:45:42.200] I was reading that the other night.
[00:45:42.200 --> 00:45:43.080] Yeah, maybe.
[00:45:43.040 --> 00:45:49.840] You know, some of these really old people, the reliability of the report, is there a birth certificate?
[00:45:44.840 --> 00:45:50.960] You know, we need to see that.
[00:45:51.280 --> 00:45:51.840] Right?
[00:45:51.840 --> 00:45:52.640] No, that's fair enough.
[00:45:52.640 --> 00:45:56.800] You know, about the blue zones where people supposedly live longer around the world.
[00:45:57.520 --> 00:46:04.400] But all that data now is called into question because it was all self-report data of how old they actually were.
[00:46:05.040 --> 00:46:06.400] Yeah, that's fair enough.
[00:46:06.400 --> 00:46:14.080] I did read, though, so the 122-year-old woman, she quit smoking at 117.
[00:46:14.080 --> 00:46:15.120] That's right.
[00:46:16.160 --> 00:46:20.640] Yeah, so confounding variables are a complicating factor for scientists.
[00:46:20.640 --> 00:46:20.960] Yeah.
[00:46:20.960 --> 00:46:22.480] How could that be?
[00:46:22.480 --> 00:46:26.800] And the guy that led the perfectly healthy life dropped dead at 35, right?
[00:46:27.120 --> 00:46:27.760] I know.
[00:46:28.000 --> 00:46:34.800] You know that guy that, if you like Netflix, that Brian Johnson, the tech billionaire, is trying to live, well, forever.
[00:46:35.040 --> 00:46:42.800] And he's got all the money and resources and help and technology to test his blood every day.
[00:46:42.800 --> 00:46:45.040] And he takes all these supplements.
[00:46:45.040 --> 00:46:48.480] He's even more fanatic than Ray Kurzweil about this.
[00:46:48.480 --> 00:46:54.320] But if you watch the Netflix, I don't know if it's Netflix, but anyway, there's a documentary about him where they follow him around.
[00:46:54.320 --> 00:46:56.240] To me, he doesn't look healthy.
[00:46:56.480 --> 00:47:01.600] I mean, he does all this stuff that's supposedly healthy, but he never goes outside because he doesn't want to get any sun.
[00:47:01.600 --> 00:47:02.240] Okay.
[00:47:02.560 --> 00:47:04.000] So he looks kind of pale.
[00:47:04.000 --> 00:47:10.320] And then he has dinner at 11:30 in the morning, and he goes to bed by 8:15 at the latest.
[00:47:10.320 --> 00:47:12.640] And the filmmaker is like, How's your life?
[00:47:12.640 --> 00:47:13.440] He's a single guy.
[00:47:13.440 --> 00:47:14.240] How's your love life?
[00:47:14.240 --> 00:47:16.080] He goes, Not very good.
[00:47:17.360 --> 00:47:17.840] I know.
[00:47:18.000 --> 00:47:25.440] I heard a podcast with him, and I thought, well, if that's, if you have to live a long time like that, then I prefer to die younger.
[00:47:25.680 --> 00:47:26.640] Doesn't sound like living.
[00:47:26.720 --> 00:47:27.760] I'd have some fun.
[00:47:27.760 --> 00:47:28.240] Yeah, yeah.
[00:47:28.960 --> 00:47:29.280] All right.
[00:47:29.440 --> 00:47:30.440] Yeah, the things people do.
[00:47:30.680 --> 00:47:44.440] Some of the other things that you talk about in the book that I enjoyed, Paul Ekman's research on reading emotions in people's faces and body language, and then Trivor's deceit and self-deception.
[00:47:44.440 --> 00:47:52.840] You know, where we evolved a reasonable capacity to detect when somebody else is trying to deceive us, but it's not perfect.
[00:47:52.840 --> 00:48:04.680] And if the person believes the lie, you know, believes what they're saying, then their body won't give off the cues that they're lying, which, you know, Trivor's argues is a heavier cognitive load.
[00:48:04.680 --> 00:48:11.960] So you have to keep track of all the stuff, and that gets reflected in your micro-expressions in your face, I guess, and whatever.
[00:48:12.280 --> 00:48:14.440] So I thought that was pretty interesting.
[00:48:15.400 --> 00:48:23.720] Well, so I don't know if you remember when you came to Australia, but we have very strict rules in what you are allowed to bring in.
[00:48:23.720 --> 00:48:29.000] So if you have anything made of timber, you're not allowed to declare it.
[00:48:29.000 --> 00:48:32.520] You're not allowed to bring in fruit, vegetables, etc., etc.
[00:48:33.160 --> 00:48:38.680] And at the time, I was still sort of moving from the Netherlands to Australia.
[00:48:38.680 --> 00:48:45.080] So I had packed some wooden things in the Netherlands, put them in my bag, completely forgot about them.
[00:48:45.080 --> 00:48:48.840] So on the plane, when you're about to land in Australia, you have to tick all the books.
[00:48:48.840 --> 00:48:51.320] No, I don't have any wooden items, etc.
[00:48:52.200 --> 00:48:56.440] I landed in Sydney and they had they x-rayed my bag.
[00:48:56.440 --> 00:48:58.920] Oh, no, there's always ask you questions.
[00:48:58.920 --> 00:49:01.000] You know, do you have anything that you need to declare?
[00:49:01.000 --> 00:49:07.400] And I said, no, I don't have anything to declare because I honestly totally forgot about these wooden things that I put in my bag.
[00:49:07.720 --> 00:49:09.640] And so I said, no, I've got nothing.
[00:49:09.640 --> 00:49:12.520] And then they said, okay, you can just go through.
[00:49:12.520 --> 00:49:16.800] At home, I opened my bag, and the first thing I saw were those wooden items.
[00:49:16.800 --> 00:49:28.080] And I thought, Oh, dear, if they would have x-rayed my bag and I would have and I had said, No, I don't have anything made of wood, I would have been in serious trouble because I would have thought that I lied.
[00:49:28.080 --> 00:49:31.040] But I wasn't lying because I completely forgotten.
[00:49:31.360 --> 00:49:34.000] So, clearly, my body language is very clear.
[00:49:34.160 --> 00:49:36.800] No, she doesn't have anything that she needs to declare.
[00:49:36.800 --> 00:49:41.440] So, I totally, I totally believe this self-deception.
[00:49:41.760 --> 00:49:49.600] And I'm sure you've had arguments with people who claim that they were certain that they were right about something, even though you know that they were wrong.
[00:49:49.600 --> 00:49:53.680] But they're just so convinced that there's no way you can convince them that they're wrong.
[00:49:53.680 --> 00:50:20.800] So, I think self-deception makes complete sense, but at the same time, it's also very scary because I seem to remember that he used examples of airline pilots who made terrible mistakes but were certain that they were right, really believed that they could do something or they basically convinced themselves that they were much better, which I think is often something you find in young men too.
[00:50:20.800 --> 00:50:29.120] One of the reasons why young men have a higher mortality than young females that believe too much in themselves.
[00:50:29.120 --> 00:50:35.600] Competing for the Darwin Award picking themselves out of the gene pool early before returning to them.
[00:50:35.680 --> 00:50:37.280] Is that still being handed out, by the way?
[00:50:37.520 --> 00:50:38.880] I haven't seen it in a few years.
[00:50:38.880 --> 00:50:40.880] I'm not sure that they do that anymore.
[00:50:41.040 --> 00:50:42.960] Because I'm sure there are many contestants.
[00:50:43.360 --> 00:50:54.000] Oh my god, well, you know, Twitter has endless feeds of videos of guys like, hold my beer while I jump off the roof and see if I can make it to the pool.
[00:50:54.320 --> 00:50:59.440] And there's just hundreds of them, just crazy stuff that people, mostly guys, right?
[00:50:59.440 --> 00:51:02.600] Probably, and this, you know, Jeffrey Miller would probably have something to say about this, right?
[00:51:02.840 --> 00:51:08.440] Status seeking amongst other guys, trying to impress women, you know, that sort of thing.
[00:51:08.440 --> 00:51:15.720] And the idea being, my genes are so good, I can do this crazy-ass, completely high-risk thing and still survive.
[00:51:15.720 --> 00:51:18.200] So that's how good I am.
[00:51:18.200 --> 00:51:19.080] Until they're not.
[00:51:19.240 --> 00:51:20.120] Until they're not, right.
[00:51:20.120 --> 00:51:21.800] Until they get taken out of the gene pool earlier.
[00:51:21.880 --> 00:51:22.680] But not all of them.
[00:51:22.680 --> 00:51:25.240] So some of that still exists in our gene pool.
[00:51:25.560 --> 00:51:25.880] I know.
[00:51:25.880 --> 00:51:34.840] I was going to ask you: how old would we be if your half of the species had hips big enough and a pelvic opening big enough to have a big brain baby at?
[00:51:34.840 --> 00:51:36.840] I don't know, what, 18 months or 20 months?
[00:51:37.080 --> 00:51:41.240] What would our natural birth gestation period be?
[00:51:41.400 --> 00:51:43.080] I think about twice as long.
[00:51:43.080 --> 00:51:45.320] Yeah, so 18 months.
[00:51:45.320 --> 00:51:45.720] Yeah.
[00:51:45.720 --> 00:51:53.640] Yeah, but it just doesn't work because the mother, the pregnant woman, can't, she just can't produce the energy.
[00:51:53.640 --> 00:51:54.680] Right, right.
[00:51:55.000 --> 00:51:58.600] Oh, so it's not just the pelvic opening, it's everything.
[00:51:58.600 --> 00:51:59.480] It's both.
[00:51:59.480 --> 00:52:00.200] Okay, all right.
[00:52:00.520 --> 00:52:08.200] It's just because the brain is so expensive, she will have to produce so much more energy to maintain herself and to grow this even bigger brain.
[00:52:08.200 --> 00:52:09.000] Yeah, yeah.
[00:52:09.320 --> 00:52:14.520] I wanted to read a little section from your book because I thought this was so interesting on how scientists work here.
[00:52:14.520 --> 00:52:22.360] You're writing about this ossuary collection of bones in Hallstatt, Austria, called the Hallstatt-Beinhaas.
[00:52:22.360 --> 00:52:24.840] Beinhaas is German for house of bones.
[00:52:24.840 --> 00:52:31.560] It's an idyllic village positioned on the shore of Lake Hallstadt, surrounded by the mountains.
[00:52:32.120 --> 00:52:36.280] In the 1700s, the local church started a less than idyllic tradition.
[00:52:36.280 --> 00:52:40.760] They started to dig up corpses of people who died in the preceding 10 to 15 years.
[00:52:40.760 --> 00:52:42.600] They were running out of space.
[00:52:42.600 --> 00:52:50.400] And so family members of the deceased were asked to stack up their loved ones' bones inside the house, now known as the Hallstatt Beinhaus.
[00:52:50.560 --> 00:52:56.400] The bones of relatives would be placed next to kin, thereby starting a genealogical record.
[00:52:56.400 --> 00:53:03.520] From 1720 onward, the skulls were adorned with symbolic decorations as well as the dates of birth and death.
[00:53:03.520 --> 00:53:13.360] The practice of digging up the remains and keeping the bones in the Binehaus continued until the 1960s, providing interested scientists with a wealth of data.
[00:53:13.360 --> 00:53:14.400] I mean, that's astonishing.
[00:53:14.400 --> 00:53:15.680] Somebody would even find that.
[00:53:15.680 --> 00:53:17.040] Oh, here's a good data set.
[00:53:17.040 --> 00:53:19.120] All right, what did they do with it from there?
[00:53:20.080 --> 00:53:26.240] Well, first off, I read somewhere that it's also a tourist mecca.
[00:53:26.240 --> 00:53:31.200] So that's probably why the scientists found it in the first place.
[00:53:31.520 --> 00:53:38.000] So what they did is I wanted to see what the heritability is of components of the skull.
[00:53:38.000 --> 00:53:54.000] So as I mentioned earlier, when we had this weird genetic fluke that prevented, or not prevented, that gave us more neurons because the neuronal stem cell started to divide more and more and more, producing more neurons.
[00:53:54.000 --> 00:53:55.440] The skull had to adapt.
[00:53:55.440 --> 00:54:07.360] So what these people wanted to find out is if you change one, if there's selection on one component of the skull, does that translate to the whole skull changing?
[00:54:07.360 --> 00:54:20.640] So basically, is the skull a collection of independent parts on which selection acts independently, or is it basically one structure?
[00:54:20.640 --> 00:54:25.200] So, selection acts on the same, on the whole structure.
[00:54:25.200 --> 00:54:27.920] And so, they looked at different components.
[00:54:28.160 --> 00:54:42.440] So, this base of the neck, basically, and all the different components of the skull, all the little bones that make the skull.
[00:54:43.080 --> 00:54:54.600] And they basically looked at individuals that were related, compared those different components, and looked at the heritability.
[00:54:54.600 --> 00:55:01.480] And what they found was that the skull basically changes as one.
[00:55:01.480 --> 00:55:08.920] So, if selection acts on one particular case, one particular part of the skull, the whole skull changes.
[00:55:08.920 --> 00:55:19.080] And that fitted my story because it means that when you have this expanding brain, that puts pressure on the base of the skull.
[00:55:19.080 --> 00:55:31.960] And that then, if you put selection on one part of the skull, the whole skull is able to change to compensate or to allow that large brain to grow.
[00:55:31.960 --> 00:55:37.240] And that's why we ended up with this very bulbous head, which you don't see when you have hair.
[00:55:37.240 --> 00:55:41.000] But my grandchild, of course, she doesn't have any hair.
[00:55:41.480 --> 00:55:44.760] She really has a very strange, large head.
[00:55:44.760 --> 00:55:51.880] It's funny that, you know, once you start reading and writing about these things, you notice it in little babies.
[00:55:51.880 --> 00:55:59.080] Babies have strange heads, and that head is all to do with having to fit that large brain in.
[00:55:59.080 --> 00:56:09.480] So they found good evidence that there's a genetic, that the it's heritable, which means that selection can act on the skull.
[00:56:09.480 --> 00:56:15.120] And then through natural selection, the skull can basically change, adapt.
[00:56:15.120 --> 00:56:15.600] Right.
[00:56:14.680 --> 00:56:16.240] So interesting.
[00:56:16.400 --> 00:56:18.720] Is that pleiotropy, I think?
[00:56:18.720 --> 00:56:25.760] Where you select for one particular characteristic and a bunch of others change with it that you don't even know or anticipate?
[00:56:26.080 --> 00:56:26.880] Exactly.
[00:56:26.880 --> 00:56:41.440] Like the silver foxes in Russia, that famous experiment, where they selected for docility, defined by, you know, how close can I get to the little silver fox before it nips at me, all the way up to sitting in my lap and I'm petting it.
[00:56:42.080 --> 00:56:54.960] But they also developed floppy ears and the little spot on the forehead and a curly tail and, you know, a bunch of other things that came along with the selection for just that one thing, docility.
[00:56:55.920 --> 00:57:02.400] Well, that's self-domestication, what Wrangham uses in his book, The Goodness Paradox.
[00:57:02.640 --> 00:57:03.280] Right, right.
[00:57:03.280 --> 00:57:20.480] So if you, so he claims that humans self-domesticate it as well, because, well, so the paradox, which I actually think, I think the idea is really good, interesting, although I don't think that that was the driving force behind our sociality.
[00:57:20.480 --> 00:57:36.080] So he claims that we had to, in order to have stable social groups, we had to get rid of people who don't adhere to the rules, say bullies or whatever.
[00:57:37.040 --> 00:57:43.120] So there was selection against highly aggressive individuals.
[00:57:44.720 --> 00:57:47.680] So that's basically the idea behind self-domestication.
[00:57:47.920 --> 00:57:51.680] You become nicer and kinder to other individuals.
[00:57:51.680 --> 00:57:56.880] But in order to get rid of those nasty pieces of work, they had to be killed.
[00:57:57.040 --> 00:58:09.000] So you had to also at the same time have to have the ability to extreme violence because you need executioners or people who kill other people in order for them not to kill other people, if you see what I mean.
[00:58:09.000 --> 00:58:10.520] So that's his paradox.
[00:58:10.520 --> 00:58:12.280] So the goodness paradox.
[00:58:12.280 --> 00:58:26.120] So he says, he claims that we self-domesticate it so that we are nice to individuals, even individuals to whom we are not related, which of course is the big problem, if you wish, in evolutionary biology.
[00:58:26.120 --> 00:58:32.040] Why are individuals of any species nice to someone with whom they don't share any genes?
[00:58:32.040 --> 00:58:36.360] Not so much nice, but why do they behave in an altruistic sense?
[00:58:36.360 --> 00:58:42.280] So have a behavior that's costly to themselves, but benefits someone else.
[00:58:42.280 --> 00:58:54.680] Darwin explained that by interacting with family, because if you do something to a family member at the expense of yourself, you're still helping your genes to be transmitted to the next generation.
[00:58:54.680 --> 00:59:00.920] But if the receiver or the recipient of your help is not related, then that's a problem.
[00:59:00.920 --> 00:59:07.480] And that's a problem that Rangham explains by self-domestications.
[00:59:07.480 --> 00:59:23.320] What I do find interesting is in all instances of where you find self-domestication across the animal kingdom, for example, you mentioned the silphal foxes, but dogs, etc., they all revert to a particular phenotype.
[00:59:23.960 --> 00:59:32.920] So that is, yes, that is pleiotropy that you select for one thing, and then you get this whole string of other characteristics.
[00:59:32.920 --> 00:59:33.320] Yeah.
[00:59:33.640 --> 00:59:43.560] This was always my criticism of the Nobel Prize sperm bank that that guy Graham created in San Diego.
[00:59:43.520 --> 00:59:47.440] And he's just selecting for one thing, you know, basically your score on an IQ test.
[00:59:47.440 --> 00:59:48.000] But what else?
[00:59:48.240 --> 00:59:50.240] But he was the only one that had sperm in it, wasn't he?
[00:59:44.920 --> 00:59:51.440] I don't think there was anyone.
[00:59:51.760 --> 00:59:53.280] There were two or three others.
[00:59:54.400 --> 00:59:57.600] Almost everybody he wrote, all Nobel Prize winners turned him down.
[00:59:57.600 --> 00:59:58.880] But yeah, there were a couple.
[00:59:58.880 --> 01:00:00.560] But it was mainly his, yeah.
[01:00:01.200 --> 01:00:06.560] Well, but the argument against it in principle is that you need genetic diversity.
[01:00:06.560 --> 01:00:20.320] And by being so careful in your eugenical selection, you're going to either get a bunch of weird stuff you're not looking for, or you'll miss out on some diversity that's actually good for the species or good for your nation or society.
[01:00:21.280 --> 01:00:36.400] Yeah, it's even more scary though, because if you believe Jeff Miller's the mating mind, so in it, so he shows this nice bell curve with intelligence for women and for men.
[01:00:36.400 --> 01:00:44.800] And the bell curves, the average totally overlaps, or most of the bell curve actually overlaps, but the tails are different.
[01:00:45.120 --> 01:00:49.680] So the tails for men are longer on both sides.
[01:00:49.680 --> 01:00:55.360] So you have more stupid, extremely stupid.
[01:00:55.680 --> 01:01:05.040] And his argument comes back from selection on so the male brain is, or the brain is successful.
[01:01:05.040 --> 01:01:06.880] Every business has an ambition.
[01:01:06.880 --> 01:01:16.320] PayPal Open is the platform designed to help you grow into yours with business loans so you can expand and access to hundreds of millions of PayPal customers worldwide.
[01:01:16.320 --> 01:01:22.640] And your customers can pay all the ways they want with PayPal, Venmo, Paylater, and all major cards.
[01:01:22.640 --> 01:01:24.720] So you can focus on scaling up.
[01:01:24.720 --> 01:01:29.520] When it's time to get growing, there's one platform for all business: PayPal Open.
[01:01:29.520 --> 01:01:32.200] Grow today at PayPalopen.com.
[01:01:29.760 --> 01:01:34.440] Loan subject to approval in available locations.
[01:01:34.760 --> 01:01:51.080] So you select a trait, which means that you, at the same time, you select for high intelligence, but the cost is that at the same time, I guess also through pleiotropy or whatever, you select for weird characteristics.
[01:01:51.080 --> 01:01:56.120] And he explains the higher prevalence of things like autism, etc.
[01:01:57.000 --> 01:01:59.720] is often linked to high intelligence.
[01:01:59.720 --> 01:02:05.160] So you may then have sperm from a Nobel Prize winner, but you don't know anything about this individual.
[01:02:05.160 --> 01:02:11.480] So you may end up with the psychopath as a child or someone with autism.
[01:02:12.120 --> 01:02:13.960] Anyway, we're kind of off topic there.
[01:02:13.960 --> 01:02:14.440] I wanted to.
[01:02:14.680 --> 01:02:22.040] Well, oh, so I was going to mention Trevor's theory of reciprocal altruism as why you would be solving the problem of altruism.
[01:02:22.040 --> 01:02:31.960] You know, I'm being nice to a stranger because, well, stranger, somebody in the other village, so that they'll be nice to me when times are hard for me or they're less likely to attack me if I was nice to them.
[01:02:31.960 --> 01:02:33.480] Something like that.
[01:02:34.440 --> 01:02:36.760] Yeah, no, that's a good example.
[01:02:37.400 --> 01:02:46.840] But then, of course, you do need to live in stable groups and you have to be able to recognize the other individual and remember that that person owes you something.
[01:02:47.240 --> 01:02:48.360] So I give an example.
[01:02:48.360 --> 01:02:55.640] It's an example I used to give in an animal behavior lecture of two young chimpanzee males.
[01:02:55.640 --> 01:02:58.760] So chimpanzees live in heroin group.
[01:02:58.760 --> 01:03:02.200] So you have the dominant male and mates with all the females.
[01:03:02.440 --> 01:03:05.000] But of course, there's the equal sex ratio.
[01:03:05.000 --> 01:03:18.000] So you always have young males, which at some stage will be either they replace the alpha male or they leave the group and try and establish a group themselves.
[01:03:18.320 --> 01:03:25.360] So this is an example that I found somewhere where there's two young males who want to mate with another female.
[01:03:25.360 --> 01:03:37.200] The female wants to mate with them, but that's tricky because if the dominant female finds out that his underlings are mating with one of his females, he won't be happy.
[01:03:37.200 --> 01:03:39.200] So they strike this deal.
[01:03:39.200 --> 01:03:44.800] So one of them distracts the male by, I don't know, throwing sticks at him or something.
[01:03:44.800 --> 01:03:52.480] The dominant male becomes completely enraged and races after the youngster that's teasing him.
[01:03:52.480 --> 01:03:57.840] The youngster runs in the opposite direction, so the other young male can mate with the female.
[01:03:57.840 --> 01:03:59.680] Next time, they reverse.
[01:03:59.680 --> 01:04:05.120] So now it's the one that mated the first time is now teasing the dominant male.
[01:04:05.120 --> 01:04:06.320] So they reciprocate.
[01:04:06.320 --> 01:04:11.200] And I think that's a great example of reciprocation.
[01:04:11.200 --> 01:04:20.160] But of course, it only works, as I said earlier, if you remember the other individual and if you can come up with a scheme in the first place.
[01:04:20.160 --> 01:04:20.720] Right.
[01:04:21.280 --> 01:04:23.760] So you have to have high social skills in order to do that.
[01:04:23.920 --> 01:04:27.680] Well, there's those vampire bats that share blood, right?
[01:04:28.000 --> 01:04:32.480] So they're keeping track somehow of who was a good cooperator and who wasn't.
[01:04:33.120 --> 01:04:33.680] Yes.
[01:04:33.680 --> 01:04:36.800] And I think, don't they always roost in the same place?
[01:04:36.800 --> 01:04:38.320] They have a communal roost at night.
[01:04:38.720 --> 01:04:39.040] Right.
[01:04:39.040 --> 01:04:39.680] Yeah.
[01:04:39.680 --> 01:04:40.080] Yeah.
[01:04:40.800 --> 01:04:43.360] And they must be related.
[01:04:43.680 --> 01:04:52.240] Well, and again, I mean, so our sense of morality just builds on something that came from our ancestors that you can still find in chimpanzees and other social animals.
[01:04:52.240 --> 01:04:53.120] Yeah, yeah, yeah.
[01:04:53.440 --> 01:05:00.680] So, it's not something that's uniquely human again and that's God-given or whatever, since I'm also an atheist.
[01:05:00.680 --> 01:05:01.160] Yeah.
[01:04:59.840 --> 01:05:05.880] A couple of the things you write about here that made me think of some interesting topics.
[01:05:06.200 --> 01:05:08.680] Mind reading, theory of mind.
[01:05:08.840 --> 01:05:15.720] You mentioned Alison Gopnik has that famous experiment from the 80s with the little box of candles or crayons.
[01:05:15.720 --> 01:05:20.200] The box of crayons, you show a three-year-old a box of crayons, you know, what's in there, crayons.
[01:05:20.200 --> 01:05:23.480] So she opens it up, and there's candles in the box of crayons.
[01:05:23.560 --> 01:05:25.160] Like, oh, surprise.
[01:05:25.160 --> 01:05:32.840] Now you put it away, and then you bring in another child, and you ask the one that knows what's in there, you know, what's he going to think is in there?
[01:05:32.840 --> 01:05:33.800] Oh, candles.
[01:05:33.800 --> 01:05:35.000] Like, no.
[01:05:35.640 --> 01:05:42.760] You know, and then by age four or five, I guess they develop a theory of mind and they're able to pass the candle, the theory of mind test, something like that.
[01:05:42.760 --> 01:05:46.920] So that's an interesting twist on this whole origins of language.
[01:05:46.920 --> 01:05:53.960] At some point, maybe age four, I think Allison said, is when mind reading comes on online.
[01:05:55.480 --> 01:05:57.880] It's not so much mind reading, though, is it?
[01:05:58.200 --> 01:06:03.800] It's understanding what the other person is thinking, which I don't think is quite the same as mind reading.
[01:06:03.800 --> 01:06:09.160] Because if you look at some of the work of Tomasello's work on little toddlers.
[01:06:09.320 --> 01:06:10.280] Yeah, you wrote about that.
[01:06:10.280 --> 01:06:11.960] Yeah, that was super interesting.
[01:06:12.360 --> 01:06:15.720] These videos are so lovely.
[01:06:17.880 --> 01:06:23.640] So you have the caretaker sits in a corner, just quietly doing her thing.
[01:06:23.960 --> 01:06:29.080] The little toddler is exploring the area.
[01:06:29.080 --> 01:06:30.360] And then a man comes in.
[01:06:30.360 --> 01:06:34.280] A stranger comes in carrying a pile of books and papers.
[01:06:34.280 --> 01:06:36.440] And there's a cabinet with a closed door.
[01:06:36.440 --> 01:06:44.040] So the stranger walks towards the cabinet and pushes the pile of books and papers against the closed door.
[01:06:44.040 --> 01:06:47.840] And this toddler is thinking, What's this man doing?
[01:06:48.160 --> 01:06:53.360] Because doesn't he know you have to open the door before you can put the books in the shelf?
[01:06:53.360 --> 01:06:55.680] So I think it can't even walk yet.
[01:06:55.680 --> 01:07:04.560] So it crawls over to the cabinet, stands up, opens the door, and then really looks at this stranger.
[01:07:04.560 --> 01:07:06.080] Is this what you want?
[01:07:07.120 --> 01:07:08.880] So that's mind reading.
[01:07:08.880 --> 01:07:09.280] Yeah.
[01:07:09.600 --> 01:07:11.040] It's just a different level.
[01:07:11.440 --> 01:07:14.480] So I think it's well, yeah.
[01:07:14.480 --> 01:07:19.600] So mind reading starts from the beginning, I think, almost when they're born, because they have to.
[01:07:20.080 --> 01:07:28.000] Because they have to know, know, if I do this, then I get what I need.
[01:07:28.960 --> 01:07:33.360] And what they need, of course, is really important because they're totally dependent.
[01:07:34.000 --> 01:07:42.160] So I can see that there's a jump, what you just explained, the jump in comprehension.
[01:07:42.160 --> 01:07:48.640] They can't know what's in the books because they haven't looked in the books, and the book says it's crayons, but I know there's candles in it.
[01:07:48.640 --> 01:07:50.720] So that's a jump in development.
[01:07:50.720 --> 01:07:53.520] But the mind reading starts much earlier.
[01:07:53.600 --> 01:07:55.040] Yeah, that's a good point.
[01:07:55.040 --> 01:07:55.760] Right.
[01:07:55.760 --> 01:07:57.280] Yeah, you've probably seen that research.
[01:07:57.280 --> 01:08:02.960] I think it's Greg Burns on dogs, mind reading versus chimps.
[01:08:02.960 --> 01:08:08.480] So, like, dogs evolved with humans in social groups, so they learn to read the cues from humans.
[01:08:08.480 --> 01:08:13.040] So the experiment is they have like an opaque little box with food in it or not.
[01:08:13.040 --> 01:08:16.400] And, you know, the experimenter points or just looks.
[01:08:16.400 --> 01:08:25.040] And the dogs are better at reading the pointing or the eye gaze than the chimps are of the humans because they learned, they evolved to be able to do that.
[01:08:25.040 --> 01:08:25.760] Yep.
[01:08:25.760 --> 01:08:26.800] That's a kind of mind reading.
[01:08:27.120 --> 01:08:28.640] Dogs are pretty good at mind reading.
[01:08:29.200 --> 01:08:31.080] Oh, my dog is totally into it.
[01:08:31.080 --> 01:08:32.760] Yeah, he knows what I'm going to do.
[01:08:32.760 --> 01:08:38.200] If I get my bike out, if I get my bike shoes out, he knows, oh, we're not going for a walk.
[01:08:38.360 --> 01:08:39.640] He crawls back to his little base.
[01:08:40.280 --> 01:08:43.560] But if I got my tennis shoes, he's like, okay, here we go.
[01:08:44.520 --> 01:08:50.440] No, but again, because the dog can't open the can of dog food, so they need the human.
[01:08:50.440 --> 01:08:53.480] So they're well attuned to understanding what the humans want.
[01:08:53.480 --> 01:08:54.120] Right.
[01:08:54.120 --> 01:08:54.440] All right.
[01:08:54.440 --> 01:08:55.800] You mentioned the film Arrival.
[01:08:55.800 --> 01:09:03.240] I love that film because finally it was a social scientist that solved the problem rather than a physicist of communicating with the aliens.
[01:09:05.080 --> 01:09:20.120] So, but that brings to mind, you know, sometimes presumably if we discovered ETIs, like SETI was successful, they would probably have some form of communication because you have to have a social species to be able to master space travel and so on.
[01:09:20.120 --> 01:09:22.920] So, therefore, you need some kind of communication.
[01:09:23.560 --> 01:09:31.320] I would say it has to be very sophisticated communication because otherwise they couldn't organize themselves or build spacecraft, etc.
[01:09:31.560 --> 01:09:32.200] Right.
[01:09:32.200 --> 01:09:32.680] Yeah.
[01:09:33.320 --> 01:09:35.000] But how would you know how to communicate?
[01:09:35.000 --> 01:09:36.200] That's always an interesting problem.
[01:09:36.200 --> 01:09:42.440] Remember the group that Sagan was a part of this group, the group, the dolphin, what was it?
[01:09:43.800 --> 01:09:45.880] There was a group of dolphin studiers.
[01:09:45.880 --> 01:09:46.440] Oh, John C.
[01:09:46.520 --> 01:09:50.120] Lilly's group, in which they were trying to communicate with dolphins.
[01:09:50.120 --> 01:09:56.040] This is when Sagan was, you know, first took up the idea of looking for aliens and discovered we don't even know what the dolphins are saying.
[01:09:56.040 --> 01:09:58.200] How are we going to know what the aliens are saying?
[01:09:59.160 --> 01:10:03.840] It's part of the experiment where this woman lifts with a dolphin in a half watched house.
[01:10:04.360 --> 01:10:04.760] Yes.
[01:10:05.160 --> 01:10:05.800] Yeah.
[01:10:05.800 --> 01:10:14.280] Yeah, I think the dolphin got kind of fond of her, sort of like Nimchimsky getting fond of the female trainer and jealous of the males around her and so on.
[01:10:14.280 --> 01:10:18.080] Yes, I remember a podcast again, yes, about that.
[01:10:18.080 --> 01:10:18.560] Oh, you do?
[01:10:18.560 --> 01:10:19.440] Okay, yeah, interesting.
[01:10:19.440 --> 01:10:20.320] You listen to podcasts.
[01:10:20.400 --> 01:10:21.680] Yeah, I think it was a radio lab.
[01:10:14.840 --> 01:10:22.240] It's been a while.
[01:10:22.480 --> 01:10:23.200] Oh, all right.
[01:10:23.200 --> 01:10:23.840] Right.
[01:10:24.160 --> 01:10:29.120] But then Lily gave dolphins LSD, and that kind of messed up the experiment, I guess.
[01:10:29.120 --> 01:10:30.240] I wonder why.
[01:10:32.080 --> 01:10:33.120] All right, finally, what?
[01:10:33.120 --> 01:10:34.400] Oh, no, two last things.
[01:10:34.400 --> 01:10:39.520] One on you, you mentioned AI and large language models and so on.
[01:10:40.240 --> 01:10:44.160] Is AI developing a new form of language and communication?
[01:10:45.120 --> 01:10:51.920] I can't see how because it's basically plagiarizing what we've done or what we're doing.
[01:10:52.560 --> 01:10:56.240] I actually read, she's an author.
[01:10:56.240 --> 01:10:59.120] I can't remember who it is.
[01:10:59.120 --> 01:11:00.160] Anna Funder.
[01:11:00.160 --> 01:11:03.120] Yes, I read an essay by her the other day.
[01:11:03.120 --> 01:11:13.440] And well, first off, she's very cross that all her work is basically being used to train AI without authors having given permission.
[01:11:13.440 --> 01:11:17.040] And I think that's the case for all authors because no one's given permission.
[01:11:17.040 --> 01:11:24.880] But if your work is available online, then AI can use your work to train.
[01:11:24.880 --> 01:11:29.280] So she makes that point that that is actually rather unethical.
[01:11:29.600 --> 01:11:43.520] And then the other point that she makes is: when I write, I'm not speaking as an afunder, when I write, I always try and come up with words that you don't normally see in this particular order.
[01:11:43.520 --> 01:11:58.720] Because what AI does, it will always use the words in a particular order, the words that are most frequently used in that particular order, because that's what it's how it's trained, because it looks at all the commonalities.
[01:11:59.040 --> 01:12:11.400] So she makes the really good argument that AI will never be able to come up with some creative way of using language because that's just not how the algorithm works.
[01:12:11.400 --> 01:12:17.000] For that to work, you have to then assume that it's going to become conscious, whatever that means.
[01:12:17.000 --> 01:12:26.120] I have a real difficulty with the whole word of consciousness and become something akin to something alive.
[01:12:26.120 --> 01:12:27.640] And I don't see that happen.
[01:12:27.640 --> 01:12:28.280] Yeah.
[01:12:28.520 --> 01:12:33.240] Okay, does it take a village that is raising children has always required a community?
[01:12:33.240 --> 01:12:40.200] The nuclear family, fueled by cultural change and capitalism, isolates parents, removing the ancestral support system.
[01:12:40.200 --> 01:12:45.560] Without grandparents or extended family nearby, we must find new ways to support parents and children.
[01:12:45.560 --> 01:12:46.040] Okay.
[01:12:46.040 --> 01:12:57.320] So just first, is there something biological or evolutionary about the nuclear family or a male and a female and their children or something like that?
[01:12:57.320 --> 01:13:01.560] Or is that more of a cultural invention in the early modern period?
[01:13:01.560 --> 01:13:03.720] Or how do you think about all that?
[01:13:04.680 --> 01:13:12.760] Well, of course, the family is a unit of selection, if you wish, because they're all related, so they all share genes.
[01:13:12.760 --> 01:13:15.320] So in that respect, it's a biological entity.
[01:13:15.320 --> 01:13:22.760] But I think what I'm more referring to is the idea that mothers in particular are the ones that have to raise their children.
[01:13:22.760 --> 01:13:29.880] And that comes from the Industrial Revolution, where dad goes out to work and the mum stays at home.
[01:13:30.200 --> 01:13:35.240] But also if you and we still s don't seem to be able to quite get over that.
[01:13:35.240 --> 01:13:45.000] So if you look at new developments, so here in Australia, because of huge immigration rates, they're constantly building new houses.
[01:13:45.120 --> 01:13:54.640] It's also because Australians want to have their detached home on a, I don't know, two and a half acre block or whatever, they say.
[01:13:54.640 --> 01:13:59.600] So we have this sprawling suburbs which are not connected to anything.
[01:13:59.600 --> 01:14:03.840] There's no schools, there's no shops, there's no place to work.
[01:14:03.840 --> 01:14:05.520] So what happens?
[01:14:05.520 --> 01:14:20.480] Mainly the men, mostly the men, they go off to work and the woman is alone with the child or the children, totally isolated from anyone else because, well, I guess she's got other mothers and children in the same neighbourhood.
[01:14:20.480 --> 01:14:21.840] That is what I refer.
[01:14:21.920 --> 01:14:25.360] That's what I think is totally unnatural.
[01:14:26.160 --> 01:14:31.520] Because, you know, the kids have to be able to run out and bump into other kids.
[01:14:31.520 --> 01:14:35.360] There are other individuals that they interact with.
[01:14:35.680 --> 01:14:45.840] Of course, need a village to raise a child is a point that Sarah Hurdie, Blaffer Hurdy, made in her books.
[01:14:46.000 --> 01:14:46.560] Mother and Mario.
[01:14:46.720 --> 01:14:49.440] But I think originally Mother Nature.
[01:14:49.760 --> 01:14:54.320] But I think it mainly Mothers and Others is another book that I really enjoyed.
[01:14:55.200 --> 01:15:03.200] But I think it comes from an older saying that she co-opted Inuits, maybe?
[01:15:03.200 --> 01:15:04.320] I can't quite remember.
[01:15:04.320 --> 01:15:09.440] I used to have it on the slide in one of my lectures, but I can't remember.
[01:15:12.400 --> 01:15:18.160] It's yeah, so the natural family, if you wish, I think are kin groups.
[01:15:18.160 --> 01:15:20.560] So, extended families.
[01:15:21.200 --> 01:15:23.120] So, that's how Lucy, etc.
[01:15:23.440 --> 01:15:24.720] Probably started.
[01:15:24.720 --> 01:15:25.120] Yeah.
[01:15:25.440 --> 01:15:26.000] Yeah.
[01:15:26.040 --> 01:15:27.280] Well, aunties.
[01:15:27.520 --> 01:15:30.040] Right, that gets back to the thesis of your book.
[01:15:30.040 --> 01:15:32.040] That's what you need language for, right?
[01:15:32.360 --> 01:15:33.160] All right, Madeline.
[01:15:33.160 --> 01:15:33.560] That's great.
[01:15:29.440 --> 01:15:34.360] That's a good place to stop.
[01:15:34.520 --> 01:15:37.480] The origin of language: how we learn to speak and why.
[01:15:37.480 --> 01:15:38.440] There it is.
[01:15:45.480 --> 01:15:48.360] This episode is brought to you by Progressive Insurance.
[01:15:48.360 --> 01:15:50.920] You chose to hit play on this podcast today.
[01:15:50.920 --> 01:15:51.880] Smart choice.
[01:15:51.880 --> 01:15:58.120] Make another smart choice with AutoQuote Explorer to compare rates from multiple car insurance companies all at once.
[01:15:58.120 --> 01:15:59.880] Try it at progressive.com.
[01:15:59.880 --> 01:16:01.960] Progressive casualty insurance company and affiliates.
[01:16:01.960 --> 01:16:03.720] Not available in all states or situations.
[01:16:03.720 --> 01:16:05.640] Prices vary based on how you buy.
[01:16:05.640 --> 01:16:11.080] What I would say to someone who's considering FDU is to come because they give great scholarship opportunities.
[01:16:11.080 --> 01:16:12.840] The professors are very helpful.
[01:16:12.840 --> 01:16:15.720] They want you to learn and they want you to pass.
[01:16:15.720 --> 01:16:18.280] And FDU gives you a second family.
[01:16:18.280 --> 01:16:21.320] Seize the moment and change your world at FDU.