Debug Information
Processing Details
- VTT File: mss543_Jean_Twenge_2025_09_09.vtt
- Processing Time: September 09, 2025 at 06:07 PM
- Total Chunks: 1
- Transcript Length: 52,110 characters
- Caption Count: 554 captions
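For reference, the caption and length figures above can be reproduced from the VTT file with a short parser. A minimal sketch, assuming a simple cue layout (a timing line followed by one text line); this is illustrative, not the pipeline's actual code:

import re

CUE_RE = re.compile(r"(\d{2}:\d{2}:\d{2}\.\d{3}) --> (\d{2}:\d{2}:\d{2}\.\d{3})")

def parse_vtt(path):
    # Collect (start, end, text) caption tuples from a WebVTT file.
    captions = []
    with open(path, encoding="utf-8") as f:
        lines = iter(f.read().splitlines())
    for line in lines:
        match = CUE_RE.search(line)
        if match:
            text = next(lines, "").strip()  # cue text follows the timing line
            captions.append((match.group(1), match.group(2), text))
    return captions

captions = parse_vtt("mss543_Jean_Twenge_2025_09_09.vtt")
transcript = "\n".join(f"[{s} --> {e}] {t}" for s, e, t in captions)
print(len(captions), len(transcript))  # expected: 554 captions, ~52,110 characters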
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 1 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.880 --> 00:00:08.720] A mochi moment from Tara, who writes, For years, all my doctor said was eat less and move more, which never worked.
[00:00:08.720 --> 00:00:10.080] But you know what does?
[00:00:10.080 --> 00:00:13.280] The simple eating tips from my nutritionist at Mochi.
[00:00:13.280 --> 00:00:19.920] And after losing over 30 pounds, I can say you're not just another GLP-1 source, you're a life source.
[00:00:19.920 --> 00:00:21.040] Thanks, Tara.
[00:00:21.040 --> 00:00:23.600] I'm Myra Ahmad, founder of Mochi Health.
[00:00:23.600 --> 00:00:27.360] To find your mochi moment, visit joinmochi.com.
[00:00:27.360 --> 00:00:30.880] Tara is a mochi member compensated for her story.
[00:00:31.200 --> 00:00:36.800] 16 years from today, Greg Gerstner will finally land the perfect cannonball.
[00:00:37.120 --> 00:00:52.000] Epic Splash, Unsuspecting Friends, a work of art only possible because Greg is already meeting all these same people at AARP volunteer and community events that keep him active and involved and help make sure his happiness lives as long as he does.
[00:00:52.000 --> 00:00:55.920] That's why the younger you are, the more you need AARP.
[00:00:55.920 --> 00:00:59.760] Learn more at aarp.org/local.
[00:01:03.920 --> 00:01:09.600] You're listening to The Michael Shermer Show.
[00:01:16.960 --> 00:01:21.120] All right, hey, everybody, it's Michael Shermer, and it's time for another episode of the Michael Shermer Show.
[00:01:21.120 --> 00:01:40.640] My returning champion guest today is Jean Twenge, a professor of psychology at San Diego State University, and the author of more than 190 scientific publications and several books based on her research, including Generation Me, iGen, and Generations, which we discussed on the show.
[00:01:40.640 --> 00:01:42.000] That was her previous book.
[00:01:42.000 --> 00:01:48.560] Her research has been covered in Time, The Atlantic, Newsweek, The New York Times, USA Today, and The Washington Post.
[00:01:48.560 --> 00:01:56.240] She's been featured on The Today Show, Good Morning America, Fox & Friends, CBS This Morning, Real Time with Bill Maher, and NPR.
[00:01:56.240 --> 00:02:06.840] She lives in San Diego with her husband and three daughters, so she knows from what she speaks about her new book, 10 Rules for Raising Kids in a High-Tech World.
[00:02:06.840 --> 00:02:08.280] All right, Jean, nice to see you again.
[00:02:08.280 --> 00:02:09.240] How are you?
[00:02:09.240 --> 00:02:09.800] I'm good.
[00:02:09.800 --> 00:02:10.520] How are you doing?
[00:02:10.520 --> 00:02:11.800] You're on your book tour this week.
[00:02:11.800 --> 00:02:18.040] I have to tell you, I don't know how many times you've ever heard my show, but I plug your Generations book all the time.
[00:02:18.040 --> 00:02:31.720] It is one of the most important books I have ever read, honestly, because you have a theoretical model overarching all the specifics that kind of tie them together, namely this life, what is it, life history theory that I really liked.
[00:02:31.720 --> 00:02:32.920] It explains a lot.
[00:02:32.920 --> 00:02:33.080] Right.
[00:02:33.400 --> 00:02:34.200] It explains a lot.
[00:02:34.200 --> 00:02:39.960] And since your book came out, I think it's pretty clear that the problem is no longer debatable.
[00:02:39.960 --> 00:02:41.080] It really is a problem.
[00:02:41.080 --> 00:02:51.880] The link between social media technology and the spike in depression, anxiety, suicidal ideation, unhappiness, and so on, right?
[00:02:52.520 --> 00:02:53.880] Yeah, absolutely.
[00:02:53.880 --> 00:03:02.440] And as you know, you know, it started with seeing those troubling trends in adolescent depression and then wondering what caused them.
[00:03:02.440 --> 00:03:08.120] So, you know, that was back in 2017 when I first started to make that argument.
[00:03:08.120 --> 00:03:11.560] And just because nothing else really seemed to fit.
[00:03:11.560 --> 00:03:14.200] And in the eight years since, that's still true.
[00:03:14.200 --> 00:03:20.200] Really looks like that is the primary cause, which then begs the question, what do we do about it?
[00:03:20.200 --> 00:03:20.680] Yeah.
[00:03:21.160 --> 00:03:23.960] Let me just read a couple of these stories from your book here.
[00:03:24.120 --> 00:03:30.440] When Alexis Spence was 11, she opened an Instagram account on her iPad without her parents knowing.
[00:03:30.440 --> 00:03:37.400] As Alexis consumed more and more thinspiration content, I didn't know that word, thinspiration, but you can see where that's going.
[00:03:37.400 --> 00:03:39.000] She began starving herself.
[00:03:39.000 --> 00:03:46.880] By 15, she had developed a severe eating disorder and spent time in a residential treatment center as she battled anorexia and suicidal thoughts.
[00:03:47.200 --> 00:03:52.800] Alexis survived, but still struggles with mental health issues and was not able to leave home for college.
[00:03:52.960 --> 00:03:56.000] Let's just kind of deconstruct that one a bit.
[00:03:56.000 --> 00:03:58.320] What's the causal mechanism going on there?
[00:03:58.320 --> 00:04:01.760] Just comparison to other people that no one can compare to?
[00:04:02.640 --> 00:04:03.680] There's several.
[00:04:03.680 --> 00:04:06.080] There are really so many causal mechanisms.
[00:04:06.400 --> 00:04:18.240] It's why I came to realize that that explanation about smartphones and social media, you know, leading to the adolescent mental health crisis had some meat to it because there's so many possible mechanisms.
[00:04:18.240 --> 00:04:21.520] So one is just displacement.
[00:04:21.520 --> 00:04:28.400] If you're spending hours and hours and hours on social media, then you're probably not sleeping enough.
[00:04:28.400 --> 00:04:32.000] You're probably not seeing friends and family in person as much.
[00:04:32.000 --> 00:04:34.080] You're probably not getting outside.
[00:04:34.080 --> 00:04:37.440] You're not having time to just sit and think.
[00:04:37.680 --> 00:04:47.840] And then in Alexis's case, it was that social comparison of these often photoshopped bodies, airbrushed bodies on Instagram.
[00:04:47.840 --> 00:04:50.320] And who can compare to that?
[00:04:50.320 --> 00:04:50.880] No one.
[00:04:50.880 --> 00:05:01.040] And so many teen girls and young women are looking at those perfect images, those perfect bodies on Instagram, and thinking they can never live up to that.
[00:05:01.040 --> 00:05:05.680] I mean, that's what Meta's internal research found, too, when they had focus groups and surveys and so on.
[00:05:05.680 --> 00:05:09.040] They found that over and over, especially for girls and young women.
[00:05:09.040 --> 00:05:09.600] Yeah.
[00:05:09.920 --> 00:05:12.960] And speaking of that, why didn't they do something about it?
[00:05:12.960 --> 00:05:15.840] Or is it, you know, their mission is to make profits.
[00:05:15.840 --> 00:05:19.920] So what do we care what happens to their customers?
[00:05:20.240 --> 00:05:20.800] Yep.
[00:05:20.800 --> 00:05:22.400] That seems to be the case.
[00:05:22.400 --> 00:05:26.720] And that's probably one of the reasons why there are a lot of lawsuits out there.
[00:05:26.720 --> 00:05:32.440] Once those internal documents got leaked, that started the flood of lawsuits.
[00:05:29.840 --> 00:05:37.000] That's actually how we know Alexis's story: she's part of those lawsuits.
[00:05:38.600 --> 00:05:46.920] A blackmailer posing as a teen girl talked a 15-year-old Utah boy into sending nude pictures of himself on Snapchat.
[00:05:46.920 --> 00:05:52.760] The user threatened to send screenshots of the images to his friends and family unless he paid $200.
[00:05:52.760 --> 00:05:55.640] He was so devastated, he took his own life.
[00:05:55.640 --> 00:05:58.520] So there's an example of it happened to boys.
[00:05:58.760 --> 00:06:06.760] One last example: Selena Rodriguez opened an Instagram account when she was 10 and before long was using it at all hours of the day and night.
[00:06:06.760 --> 00:06:09.400] When she wasn't on Instagram, she was on Snapchat.
[00:06:09.400 --> 00:06:12.680] Several adult men asked her to send nude pictures and videos of herself.
[00:06:12.680 --> 00:06:16.040] Selena was eventually hospitalized for depression and self-harm.
[00:06:16.040 --> 00:06:18.200] At the age of 11, she died by suicide.
[00:06:18.200 --> 00:06:22.840] Now, you make it clear, these are not, this doesn't always happen, but it can happen.
[00:06:23.160 --> 00:06:23.720] Right.
[00:06:23.720 --> 00:06:24.920] And I have to be clear about that.
[00:06:24.920 --> 00:06:26.840] You know, these are extreme examples.
[00:06:26.840 --> 00:06:31.800] They are really, really unfortunate outcomes, but they're also more common than you might think.
[00:06:31.800 --> 00:06:41.000] So that thing that happened to the Utah boy, sextortion: you send a nude picture and then the blackmailer asks for money.
[00:06:41.000 --> 00:06:45.960] Snapchat gets, I think it's 10,000 reports a month of this.
[00:06:45.960 --> 00:06:46.440] Oh, my God.
[00:06:46.520 --> 00:06:51.880] And the company itself acknowledges that that's probably just the tip of the iceberg.
[00:06:51.880 --> 00:06:55.000] They don't think all these are even reported.
[00:06:55.000 --> 00:06:57.080] So these things are scary.
[00:06:57.080 --> 00:06:58.920] They are extreme.
[00:06:58.920 --> 00:07:14.880] But there are these lower-level things that happen, too, that are even more common of just kids spending so much time on social media that, as I mentioned, it interferes with other activities, that they're unhappy, that they're comparing themselves.
[00:07:14.880 --> 00:07:17.360] There's just a litany of problems.
[00:07:17.360 --> 00:07:17.760] Yeah.
[00:07:14.360 --> 00:07:21.040] Yeah, I can see how easily youngsters could fall for that.
[00:07:21.120 --> 00:07:24.560] You know, I've been getting, I don't know, the last six months or so, these random texts.
[00:07:24.560 --> 00:07:25.600] You've probably gotten them.
[00:07:25.600 --> 00:07:28.240] Hey, are we still having dinner next week?
[00:07:28.560 --> 00:07:30.320] You know, sorry, you got the wrong number.
[00:07:30.320 --> 00:07:31.600] Oh, I'm so sorry.
[00:07:31.760 --> 00:07:32.960] Maybe we could be friends.
[00:07:32.960 --> 00:07:38.320] And then before, you know, sometimes I drag it out for a while and get them to, they'll send me a picture.
[00:07:38.320 --> 00:07:40.240] And it's some like really attractive woman.
[00:07:40.240 --> 00:07:45.360] And I have to keep reminding myself, this is some dude somewhere just sending me pictures, right?
[00:07:45.360 --> 00:07:46.080] But it's hard.
[00:07:46.080 --> 00:07:50.160] Even in my mind, I'm thinking, I wonder if, I wonder what I should say to her next.
[00:07:50.160 --> 00:07:51.760] It's like, no, it's him.
[00:07:52.720 --> 00:07:55.040] You're not talking to this good-looking woman.
[00:07:55.040 --> 00:07:57.280] So sometimes I'll drag it out as long as I can.
[00:07:57.280 --> 00:08:04.240] Like one of them, I said, hey, I thought we agreed not to talk until the police have cooled off.
[00:08:04.240 --> 00:08:07.520] Or did you remember to ditch the gun and hide the body?
[00:08:07.680 --> 00:08:09.040] Stuff like that.
[00:08:09.920 --> 00:08:12.480] But I've never dragged it out long enough to know what the ask is.
[00:08:12.480 --> 00:08:16.240] You know, it's probably a gift card or Bitcoin or I don't know what.
[00:08:17.120 --> 00:08:20.800] But for sure, if I was 15 and didn't know, and here's, oh, here's a cute girl.
[00:08:20.800 --> 00:08:22.080] I think I'll send her a picture.
[00:08:22.400 --> 00:08:24.400] And it's just some dude somewhere, right?
[00:08:24.720 --> 00:08:25.440] That's right.
[00:08:25.440 --> 00:08:26.240] That's right.
[00:08:26.240 --> 00:08:26.560] Yeah.
[00:08:26.560 --> 00:08:27.920] And there's all kinds of stuff like that.
[00:08:27.920 --> 00:08:28.720] It's not just that.
[00:08:28.720 --> 00:08:42.080] There's also, you know, journalists have done incredible work, you know, uncovering some of this stuff, like the evil group on Discord that has blackmailed teen girls and boys and doing all kinds of crazy stuff.
[00:08:42.640 --> 00:08:44.400] It's really awful out there.
[00:08:44.400 --> 00:08:47.440] How do they blackmail him to get, like in the example, the $200?
[00:08:47.440 --> 00:08:51.200] I mean, a kid's not going to go to his parents and say, I need $200 to send to this person.
[00:08:51.200 --> 00:08:53.600] Is it like gift card purchases or what?
[00:08:53.600 --> 00:08:57.600] I'm not sure exactly how they do it, but it's often around nude pictures.
[00:08:57.600 --> 00:09:02.600] So if you don't send me this money, I will send this picture to all of your friends and family.
[00:09:02.600 --> 00:09:03.400] I see.
[00:09:03.400 --> 00:09:03.880] Wow.
[00:09:03.880 --> 00:09:05.080] Incredible.
[00:08:59.760 --> 00:09:06.600] Okay, some of the other mechanisms.
[00:09:06.920 --> 00:09:10.280] So sleep deprivation, teens need about nine hours.
[00:09:10.280 --> 00:09:12.760] They're now getting less than seven hours.
[00:09:13.400 --> 00:09:21.480] Yeah, so sleep deprivation started to spike again right around 2012, right as smartphones and social media became common.
[00:09:21.480 --> 00:09:23.240] And it makes sense when you think about it.
[00:09:23.240 --> 00:09:30.840] It's just so tempting to stay up late scrolling through social media, responding to texts, staying on those devices.
[00:09:31.000 --> 00:09:40.040] Common Sense Media did a study and found six out of ten 11- to 17-year-olds were using their phones between midnight and 5 a.m.
[00:09:40.040 --> 00:09:41.400] on school nights.
[00:09:41.400 --> 00:09:42.040] Wow.
[00:09:42.040 --> 00:09:44.920] The parents are just asleep in the other room.
[00:09:44.920 --> 00:09:45.640] Yes.
[00:09:45.640 --> 00:09:48.920] And they should be asleep, but their kids aren't.
[00:09:48.920 --> 00:09:50.920] And they're using their phones in the middle of the night.
[00:09:50.920 --> 00:09:52.440] I mean, that's rule number two in the book.
[00:09:52.440 --> 00:09:56.200] And I say it's the most important one: no devices in the bedroom overnight.
[00:09:56.200 --> 00:09:56.680] Right.
[00:09:56.920 --> 00:10:03.640] So just like when you go to comedy clubs and they take your phone at the door and they put it in a bag and you pick it up when you're done.
[00:10:03.640 --> 00:10:04.920] Something like that.
[00:10:05.240 --> 00:10:07.000] Well, schools are using those now.
[00:10:07.000 --> 00:10:21.800] So the Yondr pouches, those got their start at concerts and comedy clubs, and then they expanded, and now they're even more often being used at schools because that's been, thankfully, the new movement: no phones during the school day, bell to bell.
[00:10:22.120 --> 00:10:22.760] Yeah.
[00:10:23.080 --> 00:10:28.040] I'm the perfect host for you because I have a nine-year-old.
[00:10:28.040 --> 00:10:32.040] Now he's not into social media, doesn't have a smartphone or anything, but the gaming tablets.
[00:10:32.040 --> 00:10:38.200] So he's into Minecraft and something called Geometry Dash, which is really wild.
[00:10:38.200 --> 00:10:40.360] But, you know, he could sit there for hours.
[00:10:40.360 --> 00:11:00.160] It's endless. And when I drop him off at his buddy's house and go out for a two-hour bike ride and come back, they're still there, you know, with the devices. It's like, oh my God. So, I mean, we have to physically take it out of his hand. You can just see the addiction; he just can't stop. Wow, it's really powerful. Yeah, it absolutely is.
[00:11:00.160 --> 00:11:07.760] And, you know, gaming is interesting because it's not as obviously toxic as social media.
[00:11:07.760 --> 00:11:18.720] It's often done either with somebody in the same room or remotely with friends, in real time, again unlike social media, but still you have to set the limits for it.
[00:11:18.720 --> 00:11:25.360] You know, it can't be something that takes over the kid's life or, you know, is interfering with sleep or homework and so on.
[00:11:25.360 --> 00:11:30.000] And too often, that's what it becomes because of that feeling of addiction.
[00:11:30.000 --> 00:11:32.160] And it's reinforcing because it's also social.
[00:11:32.160 --> 00:11:36.960] I had no idea these games, you could see other people, your friends, playing the game right now.
[00:11:37.520 --> 00:11:40.640] Again, my son is like, hey, look, but so-and-so is on there right now.
[00:11:40.640 --> 00:11:41.920] I'm like, what?
[00:11:42.560 --> 00:11:44.160] And then that just keeps it going.
[00:11:44.160 --> 00:11:56.720] So just corporate-wise, these devices, games, social media platforms, they are really literally designed to keep eyeballs focused as long as possible because that's the business model.
[00:11:56.720 --> 00:11:57.760] Yeah, absolutely.
[00:11:57.760 --> 00:12:02.480] So that's why they poured, you know, millions, probably billions of dollars into these algorithms.
[00:12:02.480 --> 00:12:07.520] Looking for a smarter way to teach your child to ride a bike and support American jobs at the same time?
[00:12:07.520 --> 00:12:11.680] Most kids' bikes are cheap imports, heavy, clunky, and hard for kids to control.
[00:12:11.680 --> 00:12:13.840] Guardian Bikes is changing that.
[00:12:13.840 --> 00:12:17.520] Assembling bikes right here in the USA, with plans for full U.S.
[00:12:17.520 --> 00:12:19.440] manufacturing in the next few months.
[00:12:19.440 --> 00:12:23.520] It's a commitment to higher quality and American craftsmanship you can trust.
[00:12:23.520 --> 00:12:28.120] Each bike is lightweight, low to the ground, and built to help kids learn to ride faster.
[00:12:28.120 --> 00:12:29.280] Many in just one day.
[00:12:29.280 --> 00:12:30.600] No training wheels needed.
[00:12:29.840 --> 00:12:32.840] Guardian's patented SureStop braking system:
[00:12:33.080 --> 00:12:40.840] One lever stops both wheels, giving your child more control and faster stops, and preventing those scary head-over-handlebar accidents.
[00:12:40.840 --> 00:12:41.720] It's so easy.
[00:12:41.720 --> 00:12:43.320] Even a two-year-old can do it.
[00:12:43.320 --> 00:12:48.600] If you're ready to support American jobs and keep your kids safe, head to GuardianBikes.com today.
[00:12:48.600 --> 00:12:50.920] You'll save hundreds compared to the competition.
[00:12:50.920 --> 00:12:51.880] Join their newsletter.
[00:12:51.880 --> 00:12:55.000] You get a free bike lock and pump, a $50 value.
[00:12:55.000 --> 00:12:59.480] Guardian Bikes built in the USA, made specifically for kids.
[00:13:00.520 --> 00:13:10.920] That show people what the app thinks they want to see and send notifications so they go back as often as possible and spend as much time as possible.
[00:13:10.920 --> 00:13:12.680] So TikTok is a good example.
[00:13:12.680 --> 00:13:23.080] It's known as having the stickiest algorithm out there, which is probably why so many kids and teens spend hours and hours and hours on TikTok.
[00:13:23.080 --> 00:13:25.480] And that time sink has its issues.
[00:13:25.480 --> 00:13:31.720] The other thing is because that algorithm is showing them what it thinks they want to see.
[00:13:31.720 --> 00:13:35.240] It's showing them more of whatever they watch and whatever they linger on.
[00:13:35.240 --> 00:13:39.160] And think about what that's going to be for the average teen.
[00:13:39.160 --> 00:13:41.400] It's going to be stuff that's inappropriate.
[00:13:41.400 --> 00:13:42.520] It's going to be violence.
[00:13:42.520 --> 00:13:47.720] It's going to be just things that they're going to look at a little bit longer and then it's going to serve them up more of that.
[00:13:47.720 --> 00:14:01.480] And then what if you have a teen in a bad mood, which does happen, happens frequently, then they're sad, they're depressed, they watch stuff that's about sadness and depression, and then it goes down this rabbit hole of darkness.
[00:14:01.800 --> 00:14:02.600] Wow.
[00:14:02.920 --> 00:14:15.840] So, for those of us who don't know how the algorithms work, is it something like when I go on Amazon because I'm an author and I buy a book, then it pushes to me related books, which I like because I'm often interested in those particular topics.
[00:14:15.840 --> 00:14:17.040] So, it's something like that.
[00:14:14.760 --> 00:14:21.920] It's just automated and it's designed to just push stuff that you've already clicked on or something like that.
[00:14:22.240 --> 00:14:25.680] Yeah, so it shows you stuff that's related.
[00:14:25.680 --> 00:14:34.160] And, you know, when it's say a movie on Netflix or a book on Amazon, that may have its positives.
[00:14:34.160 --> 00:14:41.920] But when it's videos, especially for a teen and especially for a vulnerable teen, that's where you run into trouble.
[00:14:42.160 --> 00:14:49.600] If we're talking about an adult and they just watch cooking videos, they get served up more cooking videos, and they can put it down after half an hour or an hour.
[00:14:49.600 --> 00:14:51.600] That's not as obviously harmful.
[00:14:51.600 --> 00:14:59.520] But then you take a kid or a teen and they don't have the frontal lobe development to put it down and close that app.
[00:14:59.840 --> 00:15:11.440] You know, again, what they're drawn to, just given their stage of brain development, is those things that are probably not healthy, and then they get served up more of it.
[00:15:12.720 --> 00:15:17.360] Yeah, that's why I've been watching so many World War II documentaries on Amazon Prime.
[00:15:17.360 --> 00:15:18.960] It keeps feeding me those.
[00:15:19.280 --> 00:15:21.200] Every time I log in, that's all I see.
[00:15:21.200 --> 00:15:23.120] Oh, World War II in color.
[00:15:23.120 --> 00:15:23.760] The best of all.
[00:15:24.400 --> 00:15:25.360] You can say no.
[00:15:25.360 --> 00:15:26.560] Just remember that.
[00:15:26.560 --> 00:15:27.680] Oh, oh.
[00:15:28.320 --> 00:15:30.880] I don't have any self-control even at age 70.
[00:15:30.880 --> 00:15:31.360] Okay.
[00:15:31.920 --> 00:15:35.200] As for you, let's go over your four different parenting styles.
[00:15:35.200 --> 00:15:36.800] I recognize myself with one of these.
[00:15:36.800 --> 00:15:39.040] Uninvolved parenting, that's not me.
[00:15:39.040 --> 00:15:41.200] Permissive parenting, that's me.
[00:15:41.200 --> 00:15:42.400] You know, so this is the thing.
[00:15:42.400 --> 00:15:44.400] I'm kind of non-confrontational.
[00:15:44.400 --> 00:15:45.760] I'm a softy.
[00:15:45.760 --> 00:15:50.080] Really, my sons, the discipline mostly comes from my German wife.
[00:15:50.400 --> 00:15:51.600] We're always on time.
[00:15:51.600 --> 00:15:53.200] The iPad is going down.
[00:15:53.200 --> 00:15:58.240] And he's like, if I'm there and my wife's gone, and he's like, dad, just five more minutes.
[00:15:58.240 --> 00:15:58.640] Okay.
[00:15:59.040 --> 00:15:59.960] Because I feel bad.
[00:15:59.960 --> 00:16:02.680] It's like, oh, it makes him so happy to do this.
[00:16:02.840 --> 00:16:03.720] I don't want to stop.
[00:15:59.760 --> 00:16:03.880] All right.
[00:16:03.960 --> 00:16:07.400] So, what do you say to us permissive parents?
[00:16:08.040 --> 00:16:11.480] Well, get disciplined.
[00:16:11.480 --> 00:16:12.680] Yes, get disciplined.
[00:16:12.680 --> 00:16:16.360] Now, look, I will admit I have sympathy for that approach.
[00:16:16.360 --> 00:16:18.600] I don't like fighting with my kids either.
[00:16:18.600 --> 00:16:19.800] I hate it.
[00:16:20.280 --> 00:16:24.360] But I have come to realize that it's necessary and that B.F.
[00:16:24.440 --> 00:16:25.880] Skinner was right.
[00:16:25.880 --> 00:16:31.160] That, you know, you do have to be a behaviorist as a parent because it's the only thing that works.
[00:16:31.160 --> 00:16:34.920] So rewards and punishments and more rewards than punishments.
[00:16:34.920 --> 00:16:36.440] It really is the only thing that works.
[00:16:36.440 --> 00:16:39.160] That's my conclusion after three kids.
[00:16:40.120 --> 00:16:49.080] So the problem with being too permissive is that your kid may end up being a terror.
[00:16:49.080 --> 00:16:54.040] Now, it sounds like you have a disciplinarian in your house, so that's helpful.
[00:16:54.840 --> 00:17:02.600] But when both parents are permissive and there's just very few rules, then it's hard for kids to adjust to school.
[00:17:02.600 --> 00:17:05.880] It's hard for them to adjust to college, to the workplace.
[00:17:07.640 --> 00:17:11.320] They don't have those boundaries, that structure in place.
[00:17:11.320 --> 00:17:13.240] And kids need that.
[00:17:13.240 --> 00:17:20.120] Now, they don't necessarily need the other extreme of really authoritarian, it's my way or the highway.
[00:17:20.120 --> 00:17:20.680] These are the rules.
[00:17:20.680 --> 00:17:21.880] I'm not going to explain them.
[00:17:21.880 --> 00:17:23.880] And I'm not even going to really show you much love.
[00:17:23.880 --> 00:17:25.400] We don't want to do that either.
[00:17:25.400 --> 00:17:28.120] So the happy medium is loving but firm.
[00:17:28.120 --> 00:17:31.000] So in psychology, it's called authoritative.
[00:17:31.320 --> 00:17:33.160] And I call it dolphin parenting.
[00:17:33.160 --> 00:17:34.200] That's actually not my term.
[00:17:34.200 --> 00:17:36.680] I've seen it elsewhere; there's a whole book on dolphin parenting.
[00:17:36.680 --> 00:17:38.520] But it's the idea of firm but flexible.
[00:17:38.520 --> 00:17:42.920] That's an analogy my kids would call cringe, but it's useful.
[00:17:42.920 --> 00:17:45.280] And I like the idea of like, oh, I like sea animals.
[00:17:44.520 --> 00:17:47.120] Let's use sea animals for all of them.
[00:17:44.920 --> 00:17:49.200] So the completely uninvolved is fish.
[00:17:49.280 --> 00:17:53.040] The permissive is, you're a sea sponge, apparently.
[00:17:53.440 --> 00:17:56.160] And then the authoritarian is like the tiger shark.
[00:17:56.160 --> 00:17:58.640] Like you remember the whole like tiger mom, tiger parenting?
[00:17:58.800 --> 00:17:59.360] Yes, yeah.
[00:17:59.600 --> 00:18:05.280] Like, you have to do X, Y, and Z because I want you to get into Harvard and be a prodigy, and that's it.
[00:18:05.280 --> 00:18:07.120] And I'm not ever backing down.
[00:18:07.760 --> 00:18:11.280] Now, you know, I think she also showed love and so on.
[00:18:11.280 --> 00:18:16.080] So she wasn't, you know, as extreme as I think sometimes she was made out to be in that book.
[00:18:16.080 --> 00:18:17.920] But you talked about Amy?
[00:18:17.920 --> 00:18:18.960] Yeah, Amy Chua.
[00:18:18.960 --> 00:18:19.200] Yeah.
[00:18:19.200 --> 00:18:19.680] I know her.
[00:18:19.680 --> 00:18:20.400] She's a good friend.
[00:18:20.400 --> 00:18:20.720] Yeah.
[00:18:20.960 --> 00:18:21.440] Yeah.
[00:18:21.600 --> 00:18:21.920] Yeah.
[00:18:21.920 --> 00:18:23.920] No, I thought the book was amazing.
[00:18:24.400 --> 00:18:26.880] I'm not trying to throw her under the bus by any means.
[00:18:26.880 --> 00:18:32.480] It's just that was the type of parenting that kind of came to mind.
[00:18:32.480 --> 00:18:35.760] And you think about the authoritarian of just like, that's it.
[00:18:36.640 --> 00:18:38.000] These are the rules.
[00:18:38.000 --> 00:18:40.960] And I mean, it's like the 1950s dad.
[00:18:40.960 --> 00:18:41.440] Yeah.
[00:18:41.680 --> 00:18:43.840] The really harsh parenting.
[00:18:43.840 --> 00:18:45.360] And I'm not going to play with you.
[00:18:45.360 --> 00:18:47.200] And I'm not really going to talk to you like a human being.
[00:18:47.200 --> 00:18:48.320] You just do what I say.
[00:18:48.400 --> 00:18:50.080] We don't want to go that far.
[00:18:50.400 --> 00:18:53.120] So that loving but firm really does work.
[00:18:53.120 --> 00:18:56.160] You know, you spend time with your kids, you talk to them.
[00:18:56.160 --> 00:19:02.800] But when you have rules, yeah, sure, maybe sometimes you can make a few exceptions, but the rules are the rules and you stick to them.
[00:19:02.800 --> 00:19:11.440] And that was part of the inspiration for this book: that happy medium of, yeah, you're going to talk to kids, but you also need to have guidelines in place.
[00:19:11.440 --> 00:19:12.880] Yeah, definitely.
[00:19:12.880 --> 00:19:13.680] Kids need that.
[00:19:13.680 --> 00:19:14.880] So that's under rule one.
[00:19:14.880 --> 00:19:15.920] You're in charge.
[00:19:15.920 --> 00:19:16.640] Be in charge.
[00:19:16.640 --> 00:19:18.320] Be firm but loving, and so on.
[00:19:18.320 --> 00:19:21.280] Number two, no electronic devices in the bedroom overnight.
[00:19:21.280 --> 00:19:22.240] We talked about that, right?
[00:19:22.240 --> 00:19:30.120] So, literally, just go into your kids' bedroom and just take out the iPad, the phone, whatever, the computer, and just put it in your own bedroom.
[00:19:30.120 --> 00:19:33.080] Yeah, well, there's a number of ways I go into all the different ways to do it.
[00:19:29.600 --> 00:19:35.400] And in my house, they go downstairs on the kitchen counter.
[00:19:35.640 --> 00:19:39.160] If I had any inkling they were using it in the middle of the night, I would lock it up.
[00:19:39.160 --> 00:19:40.280] That'd be the other thing.
[00:19:40.280 --> 00:19:47.160] So, you can put it in your bedroom, but if you sleep later than your kids, then that's a problem.
[00:19:47.160 --> 00:19:55.080] Like, you know, if you have a 16-year-old driving herself to school, then that may not work if she's getting up earlier than you or staying up later than you, you know.
[00:19:55.080 --> 00:19:57.480] So, it may or may not work.
[00:19:57.720 --> 00:20:00.920] I have heard a lot of parents, that's how they do it, though.
[00:20:01.240 --> 00:20:04.680] Okay, then how do you get around the problem of the objection from the kids?
[00:20:04.680 --> 00:20:08.920] But all my friends are using it and I want to talk to them or whatever.
[00:20:09.640 --> 00:20:10.520] Right.
[00:20:10.840 --> 00:20:23.480] Well, the reason that number two of no devices in the bedroom overnight is the one I suggest you follow, even if you can't follow anything else, is yes, you know, you may want to talk to your friends, but you're not going to talk to your friends at 2 a.m.
[00:20:23.960 --> 00:20:24.840] Nope.
[00:20:24.840 --> 00:20:29.480] Non-starter because sleep is just so crucial for physical and mental health.
[00:20:29.480 --> 00:20:31.480] Not going to be something we do in this house.
[00:20:31.480 --> 00:20:37.400] So, I think that's something that even a sea sponge parent can get behind, right?
[00:20:38.520 --> 00:20:43.640] So, but the overall argument of I'm going to be the only one.
[00:20:43.960 --> 00:20:52.440] So, sometimes, and I'm half joking and half not, I say that if your kid is the only one who doesn't have a smartphone or doesn't have social media, it means you won.
[00:20:52.760 --> 00:20:57.160] However, kids are still going to say that; they are going to push back on that.
[00:20:57.160 --> 00:21:00.520] You know, I actually haven't found that to be too much of a problem.
[00:21:00.520 --> 00:21:03.240] And a lot of other parents I've talked to have said the same.
[00:21:03.240 --> 00:21:07.960] And I think it's because there are these solutions.
[00:21:07.960 --> 00:21:14.440] So, for one thing, your kid doesn't really need to have social media to communicate with their friends, they just don't.
[00:21:14.440 --> 00:21:17.040] There's so many other ways that they can communicate with their friends.
[00:21:14.920 --> 00:21:21.840] They can call, they can text, they can FaceTime, they can get together in person, they could go on a game.
[00:21:22.160 --> 00:21:24.800] You know, it doesn't have to be social media.
[00:21:24.800 --> 00:21:28.320] We're talking about TikTok and how it has that very sticky algorithm.
[00:21:28.320 --> 00:21:31.200] TikTok is not a way that teens communicate with their friends.
[00:21:31.200 --> 00:21:34.800] It's a way they see viral videos, which they then talk to their friends about.
[00:21:34.800 --> 00:21:42.640] But look, if the video is that viral, maybe their friend can show it to them on their phone, or maybe they can see it on their laptop briefly or something like that.
[00:21:42.640 --> 00:21:46.320] It doesn't, they don't have to be on TikTok themselves.
[00:21:46.640 --> 00:21:49.200] They can communicate in so many other ways.
[00:21:49.200 --> 00:21:58.800] And then, in terms of the smartphone, you know, I really think, so the rule that I say is give the first smartphone when they get their driver's license.
[00:21:58.800 --> 00:22:07.200] So, in most states, that's going to be 16, plus they have to get that all-important DMV appointment, which is apparently hard to get, my 15-year-old tells me.
[00:22:08.480 --> 00:22:13.120] But they get that, they get the license, then they can have the internet-enabled phone.
[00:22:13.120 --> 00:22:19.040] Before that, give them something, another type of phone they could text their friends on, but that does not have social media.
[00:22:19.040 --> 00:22:24.480] And very important for today's conversation, also, does not have AI chatbots.
[00:22:24.480 --> 00:22:27.760] Oh, so that was something that was in its infancy when I was writing this book.
[00:22:27.760 --> 00:22:35.840] Even though I wrote this book faster than any book I've ever written, now that's the other big issue parents have to be aware of: these AI boyfriends and girlfriends.
[00:22:35.840 --> 00:22:36.960] All right.
[00:22:37.280 --> 00:22:41.520] But chatbots, you mean like ChatGPT and Grok and those?
[00:22:41.840 --> 00:22:44.000] So there's different types.
[00:22:44.000 --> 00:22:52.880] So ChatGPT and Grok are kind of the general ones, and those have their own issues around kids using them to write essays and things like that.
[00:22:53.520 --> 00:23:05.000] But what is particularly concerning to me are the apps on phones that are AI apps specifically designed to be companions, often romantic companions.
[00:23:05.320 --> 00:23:08.680] And you think about this, you think about your kid, like think about your son.
[00:23:08.680 --> 00:23:14.600] Do you want his first experience with a quote romantic relationship to be with a chatbot?
[00:23:14.600 --> 00:23:15.080] No.
[00:23:15.720 --> 00:23:16.360] And right.
[00:23:16.360 --> 00:23:18.920] I mean, absolutely not, right?
[00:23:20.120 --> 00:23:22.120] They're good enough to fool people?
[00:23:22.920 --> 00:23:32.680] Well, they are good enough for young, for children and young teens to want to interact with them.
[00:23:32.680 --> 00:23:34.520] And there's all kinds of them out there.
[00:23:34.520 --> 00:23:38.280] So the banned apps list on one of my kids' basic phones.
[00:23:38.280 --> 00:23:41.800] Yeah, they have a list of all the things that they just never allow.
[00:23:41.800 --> 00:23:45.640] And there's sexy chat, and there's the AI boyfriends and girlfriends.
[00:23:45.640 --> 00:23:49.800] And there's a lot of these specialized apps.
[00:23:49.800 --> 00:23:50.440] And they're new.
[00:23:50.440 --> 00:23:52.600] They're new in the last six months or so.
[00:23:52.600 --> 00:23:53.720] I haven't even followed that.
[00:23:54.120 --> 00:23:57.400] Do these kids know who they're actually talking to?
[00:23:57.400 --> 00:24:01.320] And it's not a person, but it's just so addicted they can't stop.
[00:24:01.320 --> 00:24:06.360] The AI has gotten so good that it feels like they're talking to someone.
[00:24:06.360 --> 00:24:16.280] And it's, it's, I mean, think about it this way: you know, the idea of teens like feeling like they have a relationship with the TV.
[00:24:16.280 --> 00:24:22.520] A mochi moment from Mark, who writes, I just want to thank you for making GLP-1s affordable.
[00:24:22.520 --> 00:24:27.480] What would have been over $1,000 a month is just $99 a month with Mochi.
[00:24:27.480 --> 00:24:29.880] Money shouldn't be a barrier to healthy weight.
[00:24:29.880 --> 00:24:33.160] Three months in, and I have smaller jeans and a bigger wallet.
[00:24:33.160 --> 00:24:34.440] You're the best.
[00:24:34.440 --> 00:24:35.480] Thanks, Mark.
[00:24:35.480 --> 00:24:38.360] I'm Myra Ahmad, founder of Mochi Health.
[00:24:38.360 --> 00:24:42.280] To find your Mochi moment, visit joinmochi.com.
[00:24:42.280 --> 00:24:44.880] Mark is a mochi member compensated for his story.
[00:24:45.680 --> 00:24:48.160] character or a celebrity?
[00:24:48.160 --> 00:24:49.600] That's happened for decades.
[00:24:49.600 --> 00:24:51.840] You remember the girl screaming for the Beatles?
[00:24:44.760 --> 00:24:52.000] Yes.
[00:24:52.960 --> 00:24:53.200] Right?
[00:24:53.200 --> 00:24:56.720] And they would decide, Am I a Paul girl or a Ringo girl?
[00:24:56.880 --> 00:24:57.760] You know what I mean?
[00:24:57.760 --> 00:25:02.560] And so that kind of parasocial, or there's a term for this, I'm forgetting, but it's something like that.
[00:25:03.280 --> 00:25:18.240] That happens a lot at that age because they may not feel like they're ready for a real relationship, so they kind of practice, but they're practicing with an AI chatbot that talks to them and always affirms everything.
[00:25:18.560 --> 00:25:23.200] It, you know, if you want it to be sexual, it'll be sexual all the time.
[00:25:23.200 --> 00:25:28.080] And what standard does that set for them having a relationship with an actual human being?
[00:25:28.080 --> 00:25:29.360] Not a good one.
[00:25:29.680 --> 00:25:35.360] No, because probably most people don't want to talk about sex all day like you do with your chatbot.
[00:25:35.360 --> 00:25:36.000] Okay, that's weird.
[00:25:36.000 --> 00:25:37.440] I didn't know about that one.
[00:25:38.560 --> 00:25:43.360] Okay, so these basic phones, I don't know anything about them.
[00:25:43.360 --> 00:25:46.800] These are like the old Motorola flip phones kind of thing?
[00:25:46.800 --> 00:25:47.360] Sort of.
[00:25:47.360 --> 00:25:50.400] Okay, you have a nine-year-old, so you are my target audience.
[00:25:50.400 --> 00:25:50.880] Yes, what do I do?
[00:25:53.040 --> 00:25:59.280] So the good news is the choice is not flip phone versus smartphone anymore.
[00:25:59.600 --> 00:26:16.080] So if you have a kid who's like, yeah, say 9, 10, 11, and they have a bus stop that's far away or they do club sports and there's issues around pickups or something and you just need really simple like texting, calling, you could get them a flip phone, you could give them a watch, although those have downsides.
[00:26:16.080 --> 00:26:17.840] We could get into that in a sec.
[00:26:19.200 --> 00:26:27.040] Once they get to the stage, around middle school, like 12 or so, they're probably going to want to text their friends.
[00:26:27.040 --> 00:26:31.480] Now, they can start out doing that from your phone, but that gets old after a while.
[00:26:31.480 --> 00:26:33.080] That's what we found.
[00:26:33.080 --> 00:26:36.440] So, we got our younger two kids these basic phones.
[00:26:36.440 --> 00:26:39.160] There's a bunch of different types, but they're not flip phones anymore.
[00:26:39.160 --> 00:26:44.440] They're usually Android or Google phones, and they are designed for kids.
[00:26:44.440 --> 00:26:48.120] So, a couple of examples: Gabb, Troomi, Pinwheel.
[00:26:48.120 --> 00:26:50.520] They have a parent portal.
[00:26:50.520 --> 00:27:02.840] And the basic principle is you can text, you can call, but generally, there is no internet browser, no social media, and no AI chatbots, no gambling apps.
[00:27:02.840 --> 00:27:05.720] You know, they have a whole list of the banned apps.
[00:27:05.720 --> 00:27:10.520] So, you have the control as a parent to say, okay, you can have Duolingo.
[00:27:11.080 --> 00:27:18.600] Or this was a whole thing, but we told our 13-year-old, yeah, you can have Spotify, even though I had my doubts about it.
[00:27:18.600 --> 00:27:23.960] But she's a Swifty, she loves Taylor Swift, she wants to listen to Taylor Swift all the time on the bus.
[00:27:24.280 --> 00:27:26.200] And I'm like, she could do worse.
[00:27:26.200 --> 00:27:27.160] That works for me.
[00:27:27.160 --> 00:27:28.040] That's cool.
[00:27:29.320 --> 00:27:30.920] And so she really wanted Spotify.
[00:27:30.920 --> 00:27:39.880] And we tried other music apps that were, you know, maybe didn't have as many potential issues, but we couldn't get them to work on the phone.
[00:27:39.880 --> 00:27:41.640] So she has Spotify.
[00:27:41.640 --> 00:27:44.440] But she really doesn't have a whole lot else, like a couple of games.
[00:27:44.440 --> 00:27:46.040] She's got Duolingo.
[00:27:46.040 --> 00:27:47.720] And that's really it.
[00:27:47.720 --> 00:27:50.200] And so she's not getting online.
[00:27:50.200 --> 00:27:57.560] And then we don't have to worry about her, you know, going to Pornhub, which, by the way, do you know how you get on Pornhub in most states, including California?
[00:27:58.120 --> 00:27:59.240] No, I've never even heard.
[00:27:59.320 --> 00:27:59.960] What is this called?
[00:27:59.960 --> 00:28:00.440] Pornhub?
[00:28:00.440 --> 00:28:01.080] I've never heard of it.
[00:28:01.560 --> 00:28:02.360] No, I'm just kidding.
[00:28:02.680 --> 00:28:03.160] I'm kidding.
[00:28:03.160 --> 00:28:04.440] I'm kidding, everybody.
[00:28:05.400 --> 00:28:06.760] No, go ahead.
[00:28:07.080 --> 00:28:08.600] You click I am 18, enter.
[00:28:08.760 --> 00:28:09.080] That's it.
[00:28:09.320 --> 00:28:09.800] Yeah, okay.
[00:28:09.800 --> 00:28:10.920] So that's it.
[00:28:10.920 --> 00:28:11.320] Yeah.
[00:28:12.040 --> 00:28:17.520] The government is just not going to regulate these companies with this phony I'm 18 button.
[00:28:14.840 --> 00:28:18.000] Come on.
[00:28:18.560 --> 00:28:25.280] Well, Texas tried, and Pornhub pulled out of the state, and then the Supreme Court upheld the Texas law against the challenge.
[00:28:25.280 --> 00:28:28.160] So with pornography, maybe we might get some progress.
[00:28:28.160 --> 00:28:30.880] Social media, I'm not as sure will happen.
[00:28:30.880 --> 00:28:34.240] For now, in most states, it's still up to parents.
[00:28:34.960 --> 00:28:45.040] And so that's another reason to not give them that smartphone and to use parental controls when you do give them the smartphone and on the laptop and so on.
[00:28:45.040 --> 00:28:48.000] Yeah, well, you make the point, you know, your book is mainly designed for parents.
[00:28:48.000 --> 00:28:50.960] So this is bottom-up solutions to the problem.
[00:28:50.960 --> 00:28:58.560] What about regular old top-down government regulation of like smoking and drinking and voting and driving and going to war and everything else?
[00:28:58.560 --> 00:28:59.680] Why are they not?
[00:28:59.680 --> 00:29:03.760] Is it in the pockets of these huge tech companies that just have lobbyists?
[00:29:04.960 --> 00:29:05.760] Maybe.
[00:29:05.760 --> 00:29:07.760] I mean, look, I'm not an attorney.
[00:29:07.760 --> 00:29:11.440] I know there's legal arguments around this too, around the First Amendment and other things.
[00:29:12.160 --> 00:29:13.840] I know that's out there.
[00:29:13.840 --> 00:29:20.480] However, we're not talking about, well, we are.
[00:29:20.960 --> 00:29:24.960] We're talking about the problem of kids.
[00:29:24.960 --> 00:29:28.800] And again, not a lawyer.
[00:29:29.120 --> 00:29:38.320] I know that there's a lot of the arguments of the First Amendment and it's going to, you know, hinge on First Amendment rights and so on.
[00:29:38.320 --> 00:29:41.120] But come on, there's got to be a way that we can do this.
[00:29:41.120 --> 00:29:45.440] Maybe it's really simple age verification when you download an app.
[00:29:45.440 --> 00:29:47.840] That wouldn't be too intrusive to adults.
[00:29:47.840 --> 00:29:51.040] We can grandfather it in so you don't have to do it now.
[00:29:51.040 --> 00:29:54.720] And then that will protect the kids and young teens of the future.
[00:29:54.720 --> 00:29:59.560] So we've got to do something about age verification.
[00:29:59.560 --> 00:30:00.520] Like a driver's license.
[00:29:59.600 --> 00:30:03.320] Like at airports, now they just scan your driver's license.
[00:30:03.320 --> 00:30:03.720] Right.
[00:29:59.840 --> 00:30:05.480] So that's the obvious solution.
[00:30:05.800 --> 00:30:11.800] And then some people have qualms about that, but there are so many companies that do age verification now.
[00:30:11.800 --> 00:30:14.120] They have their own trade association.
[00:30:14.120 --> 00:30:16.680] And a lot of the techniques are not government ID.
[00:30:16.680 --> 00:30:17.400] They're other things.
[00:30:17.880 --> 00:30:20.520] It's some AI techniques or other things.
[00:30:20.520 --> 00:30:23.240] So this is a little outside my realm of expertise.
[00:30:23.240 --> 00:30:26.680] You know, I'm not a coder or age verification provider.
[00:30:26.680 --> 00:30:31.880] But point being, there's a lot of different ways it can be done.
[00:30:32.200 --> 00:30:35.720] It's just that you need to get the political will to get the regulation in place.
[00:30:35.720 --> 00:30:37.800] Because right now it's pretty much all up to parents.
[00:30:37.800 --> 00:30:43.400] And that's so, it puts parents in such a difficult, almost impossible position.
[00:30:43.400 --> 00:30:46.600] I mean, I wrote the book so parents know what they can do.
[00:30:46.600 --> 00:30:47.000] Yeah.
[00:30:47.000 --> 00:30:51.640] But I'm also the first to say, yeah, look, you know, it shouldn't all be on us.
[00:30:51.640 --> 00:30:53.320] It shouldn't be this time consuming.
[00:30:53.320 --> 00:30:55.720] It shouldn't take this much effort and money.
[00:30:55.880 --> 00:30:59.560] We should have a government that will protect our kids from this stuff.
[00:30:59.560 --> 00:31:04.280] Okay, I know you got to run to your next interview because you're on your book tour, but it's a collective action problem.
[00:31:04.280 --> 00:31:06.520] I've tracked this for a few years.
[00:31:06.520 --> 00:31:16.440] You and Jonathan Haidt and Lenore Skenazy and others seem to have had some effect already with like schools now starting to ban phones during school hours.
[00:31:16.440 --> 00:31:17.720] That's good.
[00:31:17.720 --> 00:31:19.320] Yeah, absolutely.
[00:31:19.320 --> 00:31:21.480] And so I hope that will continue.
[00:31:21.480 --> 00:31:24.120] But to your point, it is a collective action problem.
[00:31:24.120 --> 00:31:28.280] That issue we brought up earlier of kids saying, but I'm the only one.
[00:31:28.600 --> 00:31:32.760] Let's imagine a world where, like, Australia is trying to do this.
[00:31:32.760 --> 00:31:34.600] They already have this law passed.
[00:31:34.600 --> 00:31:40.120] We're going to raise the minimum age for social media to 16, and we're actually going to verify age.
[00:31:40.120 --> 00:31:44.560] Then, no parent with a kid 15 or under would have that problem.
[00:31:44.560 --> 00:31:46.960] But, but all my friends are on it, right?
[00:31:48.240 --> 00:31:50.560] All my friends are drinking, right?
[00:31:44.920 --> 00:31:53.520] I know, and that's what really gets me going, to be honest.
[00:31:53.520 --> 00:31:58.880] When people are like, Well, you know, oh, there's no one age, and every kid is different.
[00:31:58.880 --> 00:32:01.600] You know, why are you saying a specific age?
[00:32:01.920 --> 00:32:07.520] Really, are we going to say, you know, why don't we say, oh, some 12-year-olds are ready to drive, and some 20-year-olds aren't?
[00:32:07.600 --> 00:32:08.640] It just depends on the kid.
[00:32:08.720 --> 00:32:14.160] No, we picked an age, and we stuck with it and we enforce it, and we should do the same here.
[00:32:14.160 --> 00:32:18.080] And then free range, get them outside, yes, yeah, I know.
[00:32:18.080 --> 00:32:21.200] So, that's rule eight, and I'm a huge believer in that too.
[00:32:21.280 --> 00:32:27.200] Huge fan of Lenore Skenazy and her Let Grow organization.
[00:32:27.200 --> 00:32:38.800] I have a whole list of things in that chapter that you can do with kids 12 and under and 13 and up to teach them real-world skills, to give them independence, to give them, you know, tasks around the house.
[00:32:38.800 --> 00:32:45.760] Um, my favorite example is have them cook dinner every once in a while, then you don't have to cook dinner that night.
[00:32:45.760 --> 00:32:46.640] That's a good idea.
[00:32:46.720 --> 00:32:47.360] Love it.
[00:32:47.680 --> 00:32:52.960] I love public playgrounds; just go out and watch. It's fun to just watch them in the sunshine and air.
[00:32:53.440 --> 00:32:53.600] Exactly.
[00:32:53.760 --> 00:32:54.640] Yeah, what a difference.
[00:32:54.640 --> 00:32:56.640] Yeah, all right, thanks, Jean.
[00:32:56.640 --> 00:32:58.160] Thank you very much.
[00:33:15.760 --> 00:33:18.800] Download the DraftKings Sportsbook app and use
[00:33:18.680 --> 00:33:19.760] the code GOHARD.
[00:33:18.840 --> 00:33:25.400] Use the code GOHARD to receive one hundred dollars in bonus bets instantly when you make your first five-dollar bet.
[00:33:25.400 --> 00:33:26.880] In collaboration with DraftKings.
[00:33:26.880 --> 00:33:27.920] The crown is yours.
[00:33:28.000 --> 00:33:30.440] Gambling problem? Call 1-800-GAMBLER.
[00:33:28.960 --> 00:33:32.760] In New York, call 877-8-HOPENY
[00:33:29.360 --> 00:33:36.360] or text the word HOPENY to 467369.
[00:33:36.520 --> 00:33:38.840] In Connecticut, help is available for problem gambling.
[00:33:38.920 --> 00:33:43.240] Call 888-789-7777 or visit ccpg.org.
[00:33:43.320 --> 00:33:47.720] Play responsibly. On behalf of Boot Hill Casino and Resort, Kansas. Must be 21 or older.
[00:33:47.720 --> 00:33:49.000] Charges may apply in Illinois.
[00:33:49.080 --> 00:33:51.880] Eligibility varies by jurisdiction. Not valid in Ontario.
[00:33:51.880 --> 00:33:53.720] Bonus bets are single-use.
[00:33:53.720 --> 00:33:56.840] The stake is not included in the winnings, and they expire after being awarded.
[00:33:56.840 --> 00:34:01.400] The offer ends September 29, 2025, at 11:59 p.m. Eastern.
[00:34:01.400 --> 00:34:04.360] Terms at draftkings.com/sportsbook for promos.
[00:34:04.360 --> 00:34:06.120] Marketing is hard.
[00:34:06.440 --> 00:34:07.720] But I'll tell you a little secret.
[00:34:07.720 --> 00:34:08.520] It doesn't have to be.
[00:34:08.520 --> 00:34:09.800] Let me point something out.
[00:34:09.800 --> 00:34:12.120] You're listening to a podcast right now, and it's great.
[00:34:12.120 --> 00:34:12.920] You love the host.
[00:34:12.920 --> 00:34:14.200] You seek it out and download it.
[00:34:14.200 --> 00:34:18.040] You listen to it while driving, working out, cooking, even going to the bathroom.
[00:34:18.040 --> 00:34:20.840] Podcasts are a pretty close companion.
[00:34:20.840 --> 00:34:22.520] And this is a podcast ad.
[00:34:22.520 --> 00:34:23.960] Did I get your attention?
[00:34:23.960 --> 00:34:28.680] You can reach great listeners like yourself with podcast advertising from LibSyn Ads.
[00:34:28.680 --> 00:34:38.600] Choose from hundreds of top podcasts offering host endorsements or run a pre-produced ad like this one across thousands of shows to reach your target audience and their favorite podcasts with LibSyn ads.
[00:34:38.600 --> 00:34:40.200] Go to libsynads.com.
[00:34:40.200 --> 00:34:44.040] That's L-I-B-S-Y-N ads.com today.
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
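The timestamp bound called out above lends itself to a mechanical check downstream. A minimal sketch, assuming the segments have already been parsed from the JSON reply; all names are illustrative:

def hhmmss_to_seconds(ts):
    # "01:22:45" -> 4965 seconds
    h, m, s = (int(part) for part in ts.split(":"))
    return h * 3600 + m * 60 + s

def validate_segments(segments, audio_seconds):
    # Reject any segment whose START timestamp exceeds the audio length.
    for seg in segments:
        if hhmmss_to_seconds(seg["timestamp"]) > audio_seconds:
            raise ValueError(
                f"{seg['segment_title']} starts at {seg['timestamp']}, "
                f"beyond the {audio_seconds}-second audio"
            )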
Prompt 4: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
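The range rule above (take the earlier timestamp of a cue like 01:13:42.520 --> 01:13:46.720) reduces to a one-line extraction. A minimal sketch; the regex and function name are assumptions, not the pipeline's code:

import re

def mention_timestamp(cue_range):
    # "01:13:42.520 --> 01:13:46.720" -> "01:13:42" (earlier timestamp, truncated to HH:MM:SS)
    match = re.match(r"(\d{2}:\d{2}:\d{2})", cue_range.strip())
    if not match:
        raise ValueError(f"unrecognized cue range: {cue_range!r}")
    return match.group(1)

print(mention_timestamp("01:13:42.520 --> 01:13:46.720"))  # prints 01:13:42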
Full Transcript
[00:00:00.880 --> 00:00:08.720] A mochi moment from Tara, who writes, For years, all my doctor said was eat less and move more, which never worked.
[00:00:08.720 --> 00:00:10.080] But you know what does?
[00:00:10.080 --> 00:00:13.280] The simple eating tips from my nutritionist at Mochi.
[00:00:13.280 --> 00:00:19.920] And after losing over 30 pounds, I can say you're not just another GLP1 source, you're a life source.
[00:00:19.920 --> 00:00:21.040] Thanks, Tara.
[00:00:21.040 --> 00:00:23.600] I'm Myra Ammet, founder of Mochi Health.
[00:00:23.600 --> 00:00:27.360] To find your mochi moment, visit joinmochi.com.
[00:00:27.360 --> 00:00:30.880] Tara is a mochi member compensated for her story.
[00:00:31.200 --> 00:00:36.800] 16 years from today, Greg Gerstner will finally land the perfect cannonball.
[00:00:37.120 --> 00:00:52.000] Epic Splash, Unsuspecting Friends, a work of art only possible because Greg is already meeting all these same people at AARP volunteer and community events that keep him active and involved and help make sure his happiness lives as long as he does.
[00:00:52.000 --> 00:00:55.920] That's why the younger you are, the more you need AARP.
[00:00:55.920 --> 00:00:59.760] Learn more at aarp.org/slash local.
[00:01:03.920 --> 00:01:09.600] You're listening to The Michael Shermer Show.
[00:01:16.960 --> 00:01:21.120] All right, hey, everybody, it's Michael Shermer, and it's time for another episode of the Michael Shermer Show.
[00:01:21.120 --> 00:01:40.640] My returning champion guest today is Jean Twenge, a professor of psychology at San Diego State University, and the author of more than 190 scientific publications and several books based on her research, including Generation Me, iGen, and Generations, which we discussed on the show.
[00:01:40.640 --> 00:01:42.000] That was her previous book.
[00:01:42.000 --> 00:01:48.560] Her research has been covered in Time, The Atlantic, Newsweek, The New York Times, USA Today, and The Washington Post.
[00:01:48.560 --> 00:01:56.240] She's been featured on The Today Show, Good Morning America, Fox and Friends, CBS, This Morning, Real Time with Bill Maher and NPR.
[00:01:56.240 --> 00:02:06.840] She lives in San Diego with her husband and three daughters, so she knows from what she speaks about her new book, 10 Rules for Raising Kids in a High-Tech World.
[00:02:06.840 --> 00:02:08.280] All right, Jean, nice to see you again.
[00:02:08.280 --> 00:02:09.240] How are you?
[00:02:09.240 --> 00:02:09.800] I'm good.
[00:02:09.800 --> 00:02:10.520] How are you doing?
[00:02:10.520 --> 00:02:11.800] You're on your book tour this week.
[00:02:11.800 --> 00:02:18.040] I have to tell you, I don't know how many times you've ever heard my show, but I plug your Generations book all the time.
[00:02:18.040 --> 00:02:31.720] It is one of the most important books I have ever read, honestly, because you have a theoretical model overarching all the specifics that kind of tie them together, namely this life, what is it, life history theory that I really liked.
[00:02:31.720 --> 00:02:32.920] It explains a lot.
[00:02:32.920 --> 00:02:33.080] Right.
[00:02:33.400 --> 00:02:34.200] It explains a lot.
[00:02:34.200 --> 00:02:39.960] And since your book came out, I think it's pretty clear that the problem is no longer debatable.
[00:02:39.960 --> 00:02:41.080] It really is a problem.
[00:02:41.080 --> 00:02:51.880] The link between social media technology and the spike in depression, anxiety, suicidal ideation, unhappiness, and so on, right?
[00:02:52.520 --> 00:02:53.880] Yeah, absolutely.
[00:02:53.880 --> 00:03:02.440] And as you know, you know, it started with seeing those troubling trends in adolescent depression and then wondering what caused them.
[00:03:02.440 --> 00:03:08.120] So, you know, that was back in 2017 when I first started to make that argument.
[00:03:08.120 --> 00:03:11.560] And just because nothing else really seemed to fit.
[00:03:11.560 --> 00:03:14.200] And in the eight years since, that's still true.
[00:03:14.200 --> 00:03:20.200] Really looks like that is the primary cause, which then begs the question, what do we do about it?
[00:03:20.200 --> 00:03:20.680] Yeah.
[00:03:21.160 --> 00:03:23.960] Let me just read a couple of these stories from your book here.
[00:03:24.120 --> 00:03:30.440] When Alexis Spence was 11, she opened an Instagram account on her iPad without her parents knowing.
[00:03:30.440 --> 00:03:37.400] As Alexis consumed more and more thinspiration content, I didn't know that word, thinspiration, and you can see where that's going.
[00:03:37.400 --> 00:03:39.000] She began starving herself.
[00:03:39.000 --> 00:03:46.880] By 15, she had developed a severe eating disorder and spent time in a residential treatment center as she battled anorexia and suicidal thoughts.
[00:03:47.200 --> 00:03:52.800] Alexis survived, but still struggles with mental health issues and was not able to leave home for college.
[00:03:52.960 --> 00:03:56.000] Let's just kind of deconstruct that one a bit.
[00:03:56.000 --> 00:03:58.320] What's the causal mechanism going on there?
[00:03:58.320 --> 00:04:01.760] Just comparison to other people that no one can compare to?
[00:04:02.640 --> 00:04:03.680] There's several.
[00:04:03.680 --> 00:04:06.080] There are really so many causal mechanisms.
[00:04:06.400 --> 00:04:18.240] It's why I came to realize that that explanation about smartphones and social media, you know, leading to the adolescent mental health crisis had some meat to it because there's so many possible mechanisms.
[00:04:18.240 --> 00:04:21.520] So one is just displacement.
[00:04:21.520 --> 00:04:28.400] If you're spending hours and hours and hours on social media, then you're probably not sleeping enough.
[00:04:28.400 --> 00:04:32.000] You're probably not seeing friends and family in person as much.
[00:04:32.000 --> 00:04:34.080] You're probably not getting outside.
[00:04:34.080 --> 00:04:37.440] You're not having time to just sit and think.
[00:04:37.680 --> 00:04:47.840] And then in Alexis's case, it was that social comparison of these often photoshopped bodies, airbrushed bodies on Instagram.
[00:04:47.840 --> 00:04:50.320] And who can compare to that?
[00:04:50.320 --> 00:04:50.880] No one.
[00:04:50.880 --> 00:05:01.040] And so many teen girls and young women are looking at those perfect images, those perfect bodies on Instagram, and thinking they can never live up to that.
[00:05:01.040 --> 00:05:05.680] I mean, that's what Meta's internal research found, too, when they had focus groups and surveys and so on.
[00:05:05.680 --> 00:05:09.040] They found that over and over, especially for girls and young women.
[00:05:09.040 --> 00:05:09.600] Yeah.
[00:05:09.920 --> 00:05:12.960] And speaking of that, why didn't they do something about it?
[00:05:12.960 --> 00:05:15.840] Or is it, you know, their mission is to make profits.
[00:05:15.840 --> 00:05:19.920] So what do we care what happens to their customers?
[00:05:20.240 --> 00:05:20.800] Yep.
[00:05:20.800 --> 00:05:22.400] That seems to be the case.
[00:05:22.400 --> 00:05:26.720] And that's probably one of the reasons why there are a lot of lawsuits out there.
[00:05:26.720 --> 00:05:32.440] Once those internal documents got leaked, that started the flood of lawsuits.
[00:05:29.840 --> 00:05:37.000] That's actually how we know Alexis's story: she's part of those lawsuits.
[00:05:38.600 --> 00:05:46.920] A blackmailer posing as a teen girl talked a 15-year-old Utah boy into sending nude pictures of himself on Snapchat.
[00:05:46.920 --> 00:05:52.760] The user threatened to send screenshots of the images to his friends and family unless he paid $200.
[00:05:52.760 --> 00:05:55.640] He was so devastated, he took his own life.
[00:05:55.640 --> 00:05:58.520] So there's an example of it happened to boys.
[00:05:58.760 --> 00:06:06.760] One last example: Selena Rodriguez opened an Instagram account when she was 10 and before long was using it at all hours of the day and night.
[00:06:06.760 --> 00:06:09.400] When she wasn't on Instagram, she was on Snapchat.
[00:06:09.400 --> 00:06:12.680] Several adult men asked her to send nude pictures and videos of herself.
[00:06:12.680 --> 00:06:16.040] Selena was eventually hospitalized for depression and self-harm.
[00:06:16.040 --> 00:06:18.200] At the age of 11, she died by suicide.
[00:06:18.200 --> 00:06:22.840] Now, you make it clear, these are not, this doesn't always happen, but it can happen.
[00:06:23.160 --> 00:06:23.720] Right.
[00:06:23.720 --> 00:06:24.920] And I have to be clear about that.
[00:06:24.920 --> 00:06:26.840] You know, these are extreme examples.
[00:06:26.840 --> 00:06:31.800] They are really, really unfortunate outcomes, but they're also more common than you might think.
[00:06:31.800 --> 00:06:41.000] So that thing that happened to the Utah boy, sextortion: you're going to send a nude picture and then the blackmailer asks for money.
[00:06:41.000 --> 00:06:45.960] Snapchat gets, I think it's 10,000 reports a month of this.
[00:06:45.960 --> 00:06:46.440] Oh, my God.
[00:06:46.520 --> 00:06:51.880] And the company themselves acknowledges that that's probably just the tip of the iceberg.
[00:06:51.880 --> 00:06:55.000] They don't think all these are even reported.
[00:06:55.000 --> 00:06:57.080] So these things are scary.
[00:06:57.080 --> 00:06:58.920] They are extreme.
[00:06:58.920 --> 00:07:14.880] But there are these lower-level things that happen, too, that are even more common of just kids spending so much time on social media that, as I mentioned, it interferes with other activities, that they're unhappy, that they're comparing themselves.
[00:07:14.880 --> 00:07:17.360] There's just a litany of problems.
[00:07:17.360 --> 00:07:17.760] Yeah.
[00:07:14.360 --> 00:07:21.040] Yeah, I can see how easily youngsters could fall for that.
[00:07:21.120 --> 00:07:24.560] You know, I've been getting, I don't know, the last six months or so, these random texts.
[00:07:24.560 --> 00:07:25.600] You've probably gotten them.
[00:07:25.600 --> 00:07:28.240] Hey, are we still having dinner next week?
[00:07:28.560 --> 00:07:30.320] You know, sorry, you got the wrong number.
[00:07:30.320 --> 00:07:31.600] Oh, I'm so sorry.
[00:07:31.760 --> 00:07:32.960] Maybe we could be friends.
[00:07:32.960 --> 00:07:38.320] And then before, you know, sometimes I drag it out for a while and get them to, they'll send me a picture.
[00:07:38.320 --> 00:07:40.240] And it's some like really attractive woman.
[00:07:40.240 --> 00:07:45.360] And I have to keep reminding myself, this is some dude somewhere just sending me pictures, right?
[00:07:45.360 --> 00:07:46.080] But it's hard.
[00:07:46.080 --> 00:07:50.160] Even in my mind, I'm thinking, I wonder if, I wonder what I should say to her next.
[00:07:50.160 --> 00:07:51.760] It's like, no, it's a him.
[00:07:52.720 --> 00:07:55.040] You're not talking to this good-looking woman.
[00:07:55.040 --> 00:07:57.280] So sometimes I'll drag it out as long as I can.
[00:07:57.280 --> 00:08:04.240] Like one of them, I said, hey, I thought we agreed not to talk until the police have cooled off.
[00:08:04.240 --> 00:08:07.520] Or did you remember to ditch the gun and hide the body?
[00:08:07.680 --> 00:08:09.040] Stuff like that.
[00:08:09.920 --> 00:08:12.480] But I've never dragged it out long enough to know what the ask is.
[00:08:12.480 --> 00:08:16.240] You know, it's probably a gift card or Bitcoin or I don't know what.
[00:08:17.120 --> 00:08:20.800] But for sure, if I was 15 and didn't know, and here's, oh, here's a cute girl.
[00:08:20.800 --> 00:08:22.080] I think I'll send her a picture.
[00:08:22.400 --> 00:08:24.400] And it's just some dude somewhere, right?
[00:08:24.720 --> 00:08:25.440] That's right.
[00:08:25.440 --> 00:08:26.240] That's right.
[00:08:26.240 --> 00:08:26.560] Yeah.
[00:08:26.560 --> 00:08:27.920] And there's all kinds of stuff like that.
[00:08:27.920 --> 00:08:28.720] It's not just that.
[00:08:28.720 --> 00:08:42.080] There's also, you know, journalists have done incredible work, you know, uncovering some of this stuff, like the evil group on Discord that has blackmailed teen girls and boys and doing all kinds of crazy stuff.
[00:08:42.640 --> 00:08:44.400] It's really awful out there.
[00:08:44.400 --> 00:08:47.440] How do they blackmail him to get like the example of the $200?
[00:08:47.440 --> 00:08:51.200] I mean, a kid's not going to go to his parents and say, I need $200 to send to this person.
[00:08:51.200 --> 00:08:53.600] Is it like gift card purchases or what?
[00:08:53.600 --> 00:08:57.600] I'm not sure exactly how they do it, but it's often around nude pictures.
[00:08:57.600 --> 00:09:02.600] So if you don't send me this money, I will send this picture to all of your friends and family.
[00:09:02.600 --> 00:09:03.400] I see.
[00:09:03.400 --> 00:09:03.880] Wow.
[00:09:03.880 --> 00:09:05.080] Incredible.
[00:08:59.760 --> 00:09:06.600] Okay, some of the other mechanisms.
[00:09:06.920 --> 00:09:10.280] So sleep deprivation, teens need about nine hours.
[00:09:10.280 --> 00:09:12.760] They're now getting less than seven hours.
[00:09:13.400 --> 00:09:21.480] Yeah, so sleep deprivation started to spike again right around 2012, right as smartphones and social media became common.
[00:09:21.480 --> 00:09:23.240] And it makes sense when you think about it.
[00:09:23.240 --> 00:09:30.840] It's just so tempting to stay up late scrolling through social media, responding to texts, staying on those devices.
[00:09:31.000 --> 00:09:40.040] Common Sense Media did a study and found six out of ten 11- to 17-year-olds were using their phones between midnight and 5 a.m.
[00:09:40.040 --> 00:09:41.400] on school nights.
[00:09:41.400 --> 00:09:42.040] Wow.
[00:09:42.040 --> 00:09:44.920] The parents are just asleep in the other room.
[00:09:44.920 --> 00:09:45.640] Yes.
[00:09:45.640 --> 00:09:48.920] And they should be asleep, but their kids aren't.
[00:09:48.920 --> 00:09:50.920] And they're using their phones in the middle of the night.
[00:09:50.920 --> 00:09:52.440] I mean, that's rule number two in the book.
[00:09:52.440 --> 00:09:56.200] And I say it's the most important one: no devices in the bedroom overnight.
[00:09:56.200 --> 00:09:56.680] Right.
[00:09:56.920 --> 00:10:03.640] So just like sometimes when you go to comedy clubs and they take your phone at the door and they put it in a bag and you pick it up when you're done.
[00:10:03.640 --> 00:10:04.920] Something like that.
[00:10:05.240 --> 00:10:07.000] Well, schools are using those now.
[00:10:07.000 --> 00:10:21.800] So the Yondr pouches, so those got their start at concerts and comedy clubs, and then they expanded and now they're even more often being used at schools because that's been, thankfully, the new movement of no phones during the school day, bell to bell.
[00:10:22.120 --> 00:10:22.760] Yeah.
[00:10:23.080 --> 00:10:28.040] I'm the perfect host for you because I have a nine-year-old.
[00:10:28.040 --> 00:10:32.040] Now he's not into social media, doesn't have a smartphone or anything, but the gaming tablets.
[00:10:32.040 --> 00:10:38.200] So he's into Minecraft and something called Geometry Dash, which is really wild.
[00:10:38.200 --> 00:10:40.360] But, you know, he could sit there for hours.
[00:10:40.360 --> 00:11:00.160] It's endless. And when I drop him off at his buddy's house and go out for a two-hour bike ride and come back, they're still there, you know, with the devices. It's like, oh my God. So I mean, we have to physically take it out of his hand. You can just see the addiction. He just can't stop. Wow, it's really powerful. Yeah, it absolutely is.
[00:11:00.160 --> 00:11:07.760] And, you know, gaming is interesting because it's not as obviously toxic as social media.
[00:11:07.760 --> 00:11:18.720] It's often done either with somebody in the same room or with or remotely with friends, which in real time, again, unlike social media, but still you have to set the limits for it.
[00:11:18.720 --> 00:11:25.360] You know, it can't be something that takes over the kid's life or, you know, is interfering with sleep or homework and so on.
[00:11:25.360 --> 00:11:30.000] And too often, that's what it becomes because of that feeling of addiction.
[00:11:30.000 --> 00:11:32.160] And it's reinforcing because it's also social.
[00:11:32.160 --> 00:11:36.960] I had no idea these games, you could see other people, your friends, playing the game right now.
[00:11:37.520 --> 00:11:40.640] Again, my son is like, hey, look, but so-and-so is on there right now.
[00:11:40.640 --> 00:11:41.920] I'm like, what?
[00:11:42.560 --> 00:11:44.160] And then that just keeps it going.
[00:11:44.160 --> 00:11:56.720] So just corporate-wise, these devices, games, social media platforms, they are really literally designed to keep eyeballs focused as long as possible because that's the business model.
[00:11:56.720 --> 00:11:57.760] Yeah, absolutely.
[00:11:57.760 --> 00:12:02.480] So that's why they poured, you know, millions, probably billions of dollars into these algorithms.
[00:12:02.480 --> 00:12:07.520] Looking for a smarter way to teach your child to ride a bike and support American jobs at the same time?
[00:12:07.520 --> 00:12:11.680] Most kids' bikes are cheap imports, heavy, clunky, and hard for kids to control.
[00:12:11.680 --> 00:12:13.840] Guardian Bikes is changing that.
[00:12:13.840 --> 00:12:17.520] Assembling bikes right here in the USA, with plans for full U.S.
[00:12:17.520 --> 00:12:19.440] manufacturing in the next few months.
[00:12:19.440 --> 00:12:23.520] It's a commitment to higher quality and American craftsmanship you can trust.
[00:12:23.520 --> 00:12:28.120] Each bike is lightweight, low to the ground, and built to help kids learn to ride faster.
[00:12:28.120 --> 00:12:29.280] Many in just one day.
[00:12:29.280 --> 00:12:30.600] No training wheels needed.
[00:12:29.840 --> 00:12:32.840] Guardian's patented SureStop braking system:
[00:12:33.080 --> 00:12:40.840] One lever stops both wheels, giving your child more control, faster stops, and prevents those scary head-over-handlebar accidents.
[00:12:40.840 --> 00:12:41.720] It's so easy.
[00:12:41.720 --> 00:12:43.320] Even a two-year-old can do it.
[00:12:43.320 --> 00:12:48.600] If you're ready to support American jobs and keep your kids safe, head to GuardianBikes.com today.
[00:12:48.600 --> 00:12:50.920] You'll save hundreds compared to the competition.
[00:12:50.920 --> 00:12:51.880] Join their newsletter.
[00:12:51.880 --> 00:12:55.000] You get a free bike lock and pump, a $50 value.
[00:12:55.000 --> 00:12:59.480] Guardian Bikes built in the USA, made specifically for kids.
[00:13:00.520 --> 00:13:10.920] That show people what the app thinks they want to see and send notifications so they go back as often as possible and spend as much time as possible.
[00:13:10.920 --> 00:13:12.680] So TikTok is a good example.
[00:13:12.680 --> 00:13:23.080] It's known as having the stickiest algorithm out there, which is probably why so many kids and teens spend hours and hours and hours on TikTok.
[00:13:23.080 --> 00:13:25.480] And that time sink has its issues.
[00:13:25.480 --> 00:13:31.720] The other thing is because that algorithm is showing them what it thinks they want to see.
[00:13:31.720 --> 00:13:35.240] It's showing them more of whatever they watch and whatever they linger on.
[00:13:35.240 --> 00:13:39.160] And think about what that's going to be for the average teen.
[00:13:39.160 --> 00:13:41.400] It's going to be stuff that's inappropriate.
[00:13:41.400 --> 00:13:42.520] It's going to be violence.
[00:13:42.520 --> 00:13:47.720] It's going to be just things that they're going to look at a little bit longer and then it's going to serve them up more of that.
[00:13:47.720 --> 00:14:01.480] And then what if you have a teen in a bad mood, which does happen, happens frequently, then they're sad, they're depressed, they watch stuff that's about sadness and depression, and then it goes down this rabbit hole of darkness.
[00:14:01.800 --> 00:14:02.600] Wow.
[00:14:02.920 --> 00:14:15.840] So, for those of us who don't know how the algorithms work, is it something like when I go on Amazon because I'm an author and I buy a book, then it pushes related books to me, which I like because I'm often interested in those particular topics.
[00:14:15.840 --> 00:14:17.040] So, it's something like that.
[00:14:14.760 --> 00:14:21.920] It's just automated and it's designed to just push stuff that you've already clicked on or something like that.
[00:14:22.240 --> 00:14:25.680] Yeah, so it shows you stuff that's related.
[00:14:25.680 --> 00:14:34.160] And, you know, when it's say a movie on Netflix or a book on Amazon, that may have its positives.
[00:14:34.160 --> 00:14:41.920] But when it's videos, especially for a teen and especially for a vulnerable teen, that's where you run into trouble.
[00:14:42.160 --> 00:14:49.600] We're talking about an adult and they just watch cooking videos and they get served up more cooking videos and they can put it down after half an hour or an hour.
[00:14:49.600 --> 00:14:51.600] That's not as obviously harmful.
[00:14:51.600 --> 00:14:59.520] But then you take a kid or a teen and they don't have the frontal lobe development to put it down and close that app.
[00:14:59.840 --> 00:15:11.440] You know, again, what they're drawn to, just given their stage of brain development, is to look at those things that are probably not healthy, and then they get served up more of it.
[00:15:12.720 --> 00:15:17.360] Yeah, that's why I've been watching so many World War II documentaries on Amazon Prime.
[00:15:17.360 --> 00:15:18.960] It keeps feeding me those.
[00:15:19.280 --> 00:15:21.200] Every time I log in, that's all I see.
[00:15:21.200 --> 00:15:23.120] Oh, World War II in color.
[00:15:23.120 --> 00:15:23.760] The best of all.
[00:15:24.400 --> 00:15:25.360] You can say no.
[00:15:25.360 --> 00:15:26.560] Just remember that.
[00:15:26.560 --> 00:15:27.680] Oh, oh.
[00:15:28.320 --> 00:15:30.880] I don't have any self-control even at age 70.
[00:15:30.880 --> 00:15:31.360] Okay.
[00:15:31.920 --> 00:15:35.200] As for you, let's go over your four different parenting styles.
[00:15:35.200 --> 00:15:36.800] I recognize myself with one of these.
[00:15:36.800 --> 00:15:39.040] Uninvolved parenting, that's not me.
[00:15:39.040 --> 00:15:41.200] Permissive parenting, that's me.
[00:15:41.200 --> 00:15:42.400] You know, so this is the thing.
[00:15:42.400 --> 00:15:44.400] I'm kind of non-confrontational.
[00:15:44.400 --> 00:15:45.760] I'm a softy.
[00:15:45.760 --> 00:15:50.080] Really, my sons, the discipline mostly comes from my German wife.
[00:15:50.400 --> 00:15:51.600] We're always on time.
[00:15:51.600 --> 00:15:53.200] The iPad is going down.
[00:15:53.200 --> 00:15:58.240] And he's like, if I'm there and my wife's gone, and he's like, dad, just five more minutes.
[00:15:58.240 --> 00:15:58.640] Okay.
[00:15:59.040 --> 00:15:59.960] Because I feel bad.
[00:15:59.960 --> 00:16:02.680] It's like, oh, it makes him so happy to do this.
[00:16:02.840 --> 00:16:03.720] I don't want to stop.
[00:15:59.760 --> 00:16:03.880] All right.
[00:16:03.960 --> 00:16:07.400] So, what do you say to us permissive parents?
[00:16:08.040 --> 00:16:11.480] Well, get disciplined.
[00:16:11.480 --> 00:16:12.680] Yes, get disciplined.
[00:16:12.680 --> 00:16:16.360] Now, look, I will admit I have sympathy for that approach.
[00:16:16.360 --> 00:16:18.600] I don't like fighting with my kids either.
[00:16:18.600 --> 00:16:19.800] I hate it.
[00:16:20.280 --> 00:16:24.360] But I have come to realize that it's necessary and that B.F.
[00:16:24.440 --> 00:16:25.880] Skinner was right.
[00:16:25.880 --> 00:16:31.160] That, you know, you do have to be a behaviorist as a parent because it's the only thing that works.
[00:16:31.160 --> 00:16:34.920] So rewards and punishments and more rewards than punishments.
[00:16:34.920 --> 00:16:36.440] It really is the only thing that works.
[00:16:36.440 --> 00:16:39.160] That's my conclusion after three kids.
[00:16:40.120 --> 00:16:49.080] So the problem with being too permissive is that your kid may end up being a terror.
[00:16:49.080 --> 00:16:54.040] Now, it sounds like you have a disciplinarian in your house, so that's helpful.
[00:16:54.840 --> 00:17:02.600] But when both parents are permissive and there's just very few rules, then it's hard for kids to adjust to school.
[00:17:02.600 --> 00:17:05.880] It's hard for them to adjust to college, to the workplace.
[00:17:07.640 --> 00:17:11.320] They don't have those boundaries, that structure in place.
[00:17:11.320 --> 00:17:13.240] And kids need that.
[00:17:13.240 --> 00:17:20.120] Now, they don't necessarily need the other extreme of really authoritarian, it's my way or the highway.
[00:17:20.120 --> 00:17:20.680] These are the rules.
[00:17:20.680 --> 00:17:21.880] I'm not going to explain them.
[00:17:21.880 --> 00:17:23.880] And I'm not even going to really show you much love.
[00:17:23.880 --> 00:17:25.400] We don't want to do that either.
[00:17:25.400 --> 00:17:28.120] So the happy medium is loving but firm.
[00:17:28.120 --> 00:17:31.000] So in psychology, it's called authoritative.
[00:17:31.320 --> 00:17:33.160] And I call it dolphin parenting.
[00:17:33.160 --> 00:17:34.200] That's actually not my term.
[00:17:34.200 --> 00:17:36.680] I've seen it elsewhere; there's a whole book on dolphin parenting.
[00:17:36.680 --> 00:17:38.520] But it's the idea of firm but flexible.
[00:17:38.520 --> 00:17:42.920] That's an analogy my kids would call cringe, but it's useful.
[00:17:42.920 --> 00:17:45.280] And I like the idea of like, oh, I like sea animals.
[00:17:44.520 --> 00:17:47.120] Let's use sea animals for all of them.
[00:17:44.920 --> 00:17:49.200] So the completely uninvolved is fish.
[00:17:49.280 --> 00:17:53.040] The permissive is the sea sponge, apparently.
[00:17:53.440 --> 00:17:56.160] And then the authoritarian is like the tiger shark.
[00:17:56.160 --> 00:17:58.640] Like you remember the whole like tiger mom, tiger parenting?
[00:17:58.800 --> 00:17:59.360] Yes, yeah.
[00:17:59.600 --> 00:18:05.280] Like, you have to do X, Y, and Z because I want you to get into Harvard and be a prodigy, and that's it.
[00:18:05.280 --> 00:18:07.120] And I'm not ever backing down.
[00:18:07.760 --> 00:18:11.280] Now, you know, I think she also showed love and so on.
[00:18:11.280 --> 00:18:16.080] So she wasn't, you know, as extreme as I think sometimes she was made out to be in that book.
[00:18:16.080 --> 00:18:17.920] But you talked about Amy?
[00:18:17.920 --> 00:18:18.960] Yeah, Amy Chua.
[00:18:18.960 --> 00:18:19.200] Yeah.
[00:18:19.200 --> 00:18:19.680] I know her.
[00:18:19.680 --> 00:18:20.400] She's a good friend.
[00:18:20.400 --> 00:18:20.720] Yeah.
[00:18:20.960 --> 00:18:21.440] Yeah.
[00:18:21.600 --> 00:18:21.920] Yeah.
[00:18:21.920 --> 00:18:23.920] No, I thought the book was amazing.
[00:18:24.400 --> 00:18:26.880] I'm not trying to throw her under the bus by any means.
[00:18:26.880 --> 00:18:32.480] It's just that was the type of parenting that kind of came to mind.
[00:18:32.480 --> 00:18:35.760] And you think about the authoritarian of just like, that's it.
[00:18:36.640 --> 00:18:38.000] These are the rules.
[00:18:38.000 --> 00:18:40.960] And I mean, it's like the 1950s dad.
[00:18:40.960 --> 00:18:41.440] Yeah.
[00:18:41.680 --> 00:18:43.840] The really harsh parenting.
[00:18:43.840 --> 00:18:45.360] And I'm not going to play with you.
[00:18:45.360 --> 00:18:47.200] And I'm not really going to talk to you like a human being.
[00:18:47.200 --> 00:18:48.320] You just do what I say.
[00:18:48.400 --> 00:18:50.080] We don't want to go that far.
[00:18:50.400 --> 00:18:53.120] So that loving but firm really does work.
[00:18:53.120 --> 00:18:56.160] You know, you spend time with your kids, you talk to them.
[00:18:56.160 --> 00:19:02.800] But when you have rules, yeah, sure, maybe sometimes you can make a few exceptions, but the rules are the rules and you stick to them.
[00:19:02.800 --> 00:19:11.440] And that was part of the inspiration for this book: that happy medium of, yeah, you're going to talk to kids, but you also need to have guidelines in place.
[00:19:11.440 --> 00:19:12.880] Yeah, definitely.
[00:19:12.880 --> 00:19:13.680] Kids need that.
[00:19:13.680 --> 00:19:14.880] So that's under rule one.
[00:19:14.880 --> 00:19:15.920] You're in charge.
[00:19:15.920 --> 00:19:16.640] Be in charge.
[00:19:16.640 --> 00:19:18.320] Be firm but loving, and so on.
[00:19:18.320 --> 00:19:21.280] Number two, no electronic devices in the bedroom overnight.
[00:19:21.280 --> 00:19:22.240] We talked about that, right?
[00:19:22.240 --> 00:19:30.120] So, literally, just go into your kids' bedroom and just take out the iPad, the phone, whatever, the computer, and just put it in your own bedroom.
[00:19:30.120 --> 00:19:33.080] Yeah, well, there's a number of ways I go into all the different ways to do it.
[00:19:29.600 --> 00:19:35.400] And in my house, they go downstairs on the kitchen counter.
[00:19:35.640 --> 00:19:39.160] If I had any inkling they were using it in the middle of the night, I would lock it up.
[00:19:39.160 --> 00:19:40.280] That'd be the other thing.
[00:19:40.280 --> 00:19:47.160] So, you can put it in your bedroom, but if you sleep later than your kids, then that's a problem.
[00:19:47.160 --> 00:19:55.080] Like, you know, you have a 16-year-old driving herself to school, then that may not work if she's getting up earlier than you or staying up later than you, you know.
[00:19:55.080 --> 00:19:57.480] So, it may or may not work.
[00:19:57.720 --> 00:20:00.920] I have heard a lot of parents, that's how they do it, though.
[00:20:01.240 --> 00:20:04.680] Okay, then what do you get around the problem of the objection for the kids?
[00:20:04.680 --> 00:20:08.920] But all my friends are using it and I want to talk to them or whatever.
[00:20:09.640 --> 00:20:10.520] Right.
[00:20:10.840 --> 00:20:23.480] Well, the reason that number two of no devices in the bedroom overnight is the one I suggest you follow, even if you can't follow anything else, is yes, you know, you may want to talk to your friends, but you're not going to talk to your friends at 2 a.m.
[00:20:23.960 --> 00:20:24.840] Nope.
[00:20:24.840 --> 00:20:29.480] Non-starter because sleep is just so crucial for physical and mental health.
[00:20:29.480 --> 00:20:31.480] Not going to be something we do in this house.
[00:20:31.480 --> 00:20:37.400] So, I think that's something that even a sea sponge parent can get behind, right?
[00:20:38.520 --> 00:20:43.640] So, but the overall argument of I'm going to be the only one.
[00:20:43.960 --> 00:20:52.440] So, sometimes, and I'm half joking and half not, I say that if your kid is the only one who doesn't have a smartphone or doesn't have social media, it means you won.
[00:20:52.760 --> 00:20:57.160] However, kids are still going to say that; they are going to push back on that.
[00:20:57.160 --> 00:21:00.520] You know, I actually haven't found that to be too much of a problem.
[00:21:00.520 --> 00:21:03.240] And a lot of other parents I've talked to have said the same.
[00:21:03.240 --> 00:21:07.960] And I think it's because there are these solutions.
[00:21:07.960 --> 00:21:14.440] So, for one thing, your kid doesn't really need to have social media to communicate with their friends, they just don't.
[00:21:14.440 --> 00:21:17.040] There's so many other ways that they can communicate with their friends.
[00:21:14.920 --> 00:21:21.840] They can call, they can text, they can FaceTime, they can get together in person, they could go on a game.
[00:21:22.160 --> 00:21:24.800] You know, it doesn't have to be social media.
[00:21:24.800 --> 00:21:28.320] We're talking about TikTok and how it has that very sticky algorithm.
[00:21:28.320 --> 00:21:31.200] TikTok is not a way that teens communicate with their friends.
[00:21:31.200 --> 00:21:34.800] It's a way they see viral videos, which they then talk to their friends about.
[00:21:34.800 --> 00:21:42.640] But look, if the video is that viral, maybe their friend can show it to them on their phone, or maybe they can see it on their laptop briefly or something like that.
[00:21:42.640 --> 00:21:46.320] It doesn't, they don't have to be on TikTok themselves.
[00:21:46.640 --> 00:21:49.200] They can communicate in so many other ways.
[00:21:49.200 --> 00:21:58.800] And then, in terms of the smartphone, you know, I really think, so the rule that I say is give the first smartphone when they get their driver's license.
[00:21:58.800 --> 00:22:07.200] So, in most states, that's going to be 16, plus they have to get that all-important DMV appointment, which is apparently hard to get, my 15-year-old tells me.
[00:22:08.480 --> 00:22:13.120] But they get that, they get the license, then they can have the internet-enabled phone.
[00:22:13.120 --> 00:22:19.040] Before that, give them something, another type of phone they could text their friends on, but that does not have social media.
[00:22:19.040 --> 00:22:24.480] And very important for today's conversation, also, does not have AI chatbots.
[00:22:24.480 --> 00:22:27.760] Oh, so that was something that was in its infancy when I was writing this book.
[00:22:27.760 --> 00:22:35.840] Even though I wrote this book faster than any book I've ever written, now that's the other big issue parents have to be aware of: these AI boyfriends and girlfriends.
[00:22:35.840 --> 00:22:36.960] All right.
[00:22:37.280 --> 00:22:41.520] But chatbots, you mean like ChatGPT and Grok and those?
[00:22:41.840 --> 00:22:44.000] So there's different types.
[00:22:44.000 --> 00:22:52.880] So ChatGPT and Grok are kind of the general ones, and those have their own issues around kids using them to write essays and things like that.
[00:22:53.520 --> 00:23:05.000] But what is particularly concerning to me are the apps on phones that are AI apps specifically designed to be companions, often romantic companions.
[00:23:05.320 --> 00:23:08.680] And you think about this, you think about your kid, like think about your son.
[00:23:08.680 --> 00:23:14.600] Do you want his first experience with a quote romantic relationship to be with a chatbot?
[00:23:14.600 --> 00:23:15.080] No.
[00:23:15.720 --> 00:23:16.360] And right.
[00:23:16.360 --> 00:23:18.920] I mean, absolutely not, right?
[00:23:20.120 --> 00:23:22.120] They're good enough to fool people?
[00:23:22.920 --> 00:23:32.680] Well, they are good enough for young, for children and young teens to want to interact with them.
[00:23:32.680 --> 00:23:34.520] And there's all kinds of them out there.
[00:23:34.520 --> 00:23:38.280] So the banned apps list on one of my kids' basic phones.
[00:23:38.280 --> 00:23:41.800] Yeah, they have a list of all the things that they just never allow.
[00:23:41.800 --> 00:23:45.640] And there's sexy chat, and there's the AI boyfriends and girlfriends.
[00:23:45.640 --> 00:23:49.800] And there's a lot of these specialized apps.
[00:23:49.800 --> 00:23:50.440] And they're new.
[00:23:50.440 --> 00:23:52.600] They're new in the last six months or so.
[00:23:52.600 --> 00:23:53.720] I haven't even followed that.
[00:23:54.120 --> 00:23:57.400] Do these kids know who they're actually talking to?
[00:23:57.400 --> 00:24:01.320] And that it's not a person, but they're just so addicted they can't stop?
[00:24:01.320 --> 00:24:06.360] The AI has gotten so good that it feels like they're talking to someone.
[00:24:06.360 --> 00:24:16.280] And it's, it's, I mean, think about it this way: you know, the idea of teens like feeling like they have a relationship with the TV.
[00:24:16.280 --> 00:24:22.520] A mochi moment from Mark, who writes, I just want to thank you for making GLP1s affordable.
[00:24:22.520 --> 00:24:27.480] What would have been over $1,000 a month is just $99 a month with Mochi.
[00:24:27.480 --> 00:24:29.880] Money shouldn't be a barrier to healthy weight.
[00:24:29.880 --> 00:24:33.160] Three months in, and I have smaller jeans and a bigger wallet.
[00:24:33.160 --> 00:24:34.440] You're the best.
[00:24:34.440 --> 00:24:35.480] Thanks, Mark.
[00:24:35.480 --> 00:24:38.360] I'm Myra Ammet, founder of Mochi Health.
[00:24:38.360 --> 00:24:42.280] To find your Mochi moment, visit joinmochi.com.
[00:24:42.280 --> 00:24:44.880] Mark is a mochi member compensated for his story.
[00:24:45.680 --> 00:24:48.160] Character or a celebrity?
[00:24:48.160 --> 00:24:49.600] That's happened for decades.
[00:24:49.600 --> 00:24:51.840] You remember the girl screaming for the Beatles?
[00:24:44.760 --> 00:24:52.000] Yes.
[00:24:52.960 --> 00:24:53.200] Right?
[00:24:53.200 --> 00:24:56.720] And they would decide, Am I a Paul girl or a Ringo girl?
[00:24:56.880 --> 00:24:57.760] You know what I mean?
[00:24:57.760 --> 00:25:02.560] And so that kind of parasocial, or there's a term for this, I'm forgetting, but it's something like that.
[00:25:03.280 --> 00:25:18.240] That happens a lot at that age because they may not feel like they're ready for a real relationship, so they kind of practice. But they're practicing with an AI chatbot that is talking to them and always affirms everything.
[00:25:18.560 --> 00:25:23.200] It, you know, if you want it to be sexual, it'll be sexual all the time.
[00:25:23.200 --> 00:25:28.080] And what standard does that set for them having a relationship with an actual human being?
[00:25:28.080 --> 00:25:29.360] Not a good one.
[00:25:29.680 --> 00:25:35.360] No, because probably most people don't want to talk about sex all day like you do with your chatbot.
[00:25:35.360 --> 00:25:36.000] Okay, that's weird.
[00:25:36.000 --> 00:25:37.440] I didn't know about that one.
[00:25:38.560 --> 00:25:43.360] Okay, so these basic phones, I don't know anything about them.
[00:25:43.360 --> 00:25:46.800] These are like the old Motorola flip phones kind of thing?
[00:25:46.800 --> 00:25:47.360] Sort of.
[00:25:47.360 --> 00:25:50.400] Okay, you have a nine-year-old, so you, you are my target audience.
[00:25:50.400 --> 00:25:50.880] Yes, what do I do?
[00:25:53.040 --> 00:25:59.280] So the good news is the choice is not flip phone versus smartphone anymore.
[00:25:59.600 --> 00:26:16.080] So if you have a kid who's like, yeah, say 9, 10, 11, and they have a bus stop that's far away or they do club sports and there's issues around pickups or something and you just need really simple like texting, calling, you could get them a flip phone, you could give them a watch, although those have downsides.
[00:26:16.080 --> 00:26:17.840] We could get into that in a sec.
[00:26:19.200 --> 00:26:27.040] Once they get to the stage, around middle school, like 12 or so, they're probably going to want to text their friends.
[00:26:27.040 --> 00:26:31.480] Now, they can start out doing that from your phone, but that gets old after a while.
[00:26:31.480 --> 00:26:33.080] That's what we found.
[00:26:33.080 --> 00:26:36.440] So, we got our younger two kids these basic phones.
[00:26:36.440 --> 00:26:39.160] There's a bunch of different types, but they're not flip phones anymore.
[00:26:39.160 --> 00:26:44.440] They're usually Android or Google phones, and they are designed for kids.
[00:26:44.440 --> 00:26:48.120] So, a couple of examples: Gabb, Troomi, Pinwheel.
[00:26:48.120 --> 00:26:50.520] They have a parent portal.
[00:26:50.520 --> 00:27:02.840] And the basic principle is you can text, you can call, but generally, there is no internet browser, no social media, and no AI chatbots, no gambling apps.
[00:27:02.840 --> 00:27:05.720] You know, they have a whole list of the banned apps.
[00:27:05.720 --> 00:27:10.520] So, you have the control as a parent to say, okay, you can have Duolingo.
[00:27:11.080 --> 00:27:18.600] Or this was a whole thing, but we told our 13-year-old, yeah, you can have Spotify, even though I had my doubts about it.
[00:27:18.600 --> 00:27:23.960] But she's a Swiftie, she loves Taylor Swift, she wants to listen to Taylor Swift all the time on the bus.
[00:27:24.280 --> 00:27:26.200] And I'm like, she could do worse.
[00:27:26.200 --> 00:27:27.160] That works for me.
[00:27:27.160 --> 00:27:28.040] That's cool.
[00:27:29.320 --> 00:27:30.920] And so she really wanted Spotify.
[00:27:30.920 --> 00:27:39.880] And we tried other music apps that were, you know, maybe didn't have as many potential issues, but we couldn't get them to work on the phone.
[00:27:39.880 --> 00:27:41.640] So she has Spotify.
[00:27:41.640 --> 00:27:44.440] But she really doesn't have a whole lot else, like a couple of games.
[00:27:44.440 --> 00:27:46.040] She's got Duolingo.
[00:27:46.040 --> 00:27:47.720] And that's really it.
[00:27:47.720 --> 00:27:50.200] And so she's not getting on online.
[00:27:50.200 --> 00:27:57.560] And then we have to worry about her, you know, going to Pornhub, which, by the way, do you know how you get on Pornhub in most states, including California?
[00:27:58.120 --> 00:27:59.240] No, I've never even heard.
[00:27:59.320 --> 00:27:59.960] What is this called?
[00:27:59.960 --> 00:28:00.440] Pornhub?
[00:28:00.440 --> 00:28:01.080] I've never heard of it.
[00:28:01.560 --> 00:28:02.360] No, I'm just kidding.
[00:28:02.680 --> 00:28:03.160] I'm kidding.
[00:28:03.160 --> 00:28:04.440] I'm kidding, everybody.
[00:28:05.400 --> 00:28:06.760] No, go ahead.
[00:28:07.080 --> 00:28:08.600] You click I am 18, enter.
[00:28:08.760 --> 00:28:09.080] That's it.
[00:28:09.320 --> 00:28:09.800] Yeah, okay.
[00:28:09.800 --> 00:28:10.920] So that's it.
[00:28:10.920 --> 00:28:11.320] Yeah.
[00:28:12.040 --> 00:28:17.520] The government is just not going to regulate these companies with this phony I'm 18 button.
[00:28:14.840 --> 00:28:18.000] Come on.
[00:28:18.560 --> 00:28:25.280] Well, Texas tried, and Pornhub pulled out of the state, and then the Supreme Court upheld the Texas law against the challenge.
[00:28:25.280 --> 00:28:28.160] So with pornography, maybe we might get some progress.
[00:28:28.160 --> 00:28:30.880] Social media, I'm not as sure will happen.
[00:28:30.880 --> 00:28:34.240] For now, in most states, it's still up to parents.
[00:28:34.960 --> 00:28:45.040] And so that's another reason to not give them that smartphone and to use parental controls when you do give them the smartphone and on the laptop and so on.
[00:28:45.040 --> 00:28:48.000] Yeah, well, you make the point, you know, your book is mainly designed for parents.
[00:28:48.000 --> 00:28:50.960] So this is bottom-up solutions to the problem.
[00:28:50.960 --> 00:28:58.560] What about regular old top-down government regulation of like smoking and drinking and voting and driving and going to war and everything else?
[00:28:58.560 --> 00:28:59.680] Why are they not?
[00:28:59.680 --> 00:29:03.760] Is it in the pockets of these huge tech companies that just have lobbyists?
[00:29:04.960 --> 00:29:05.760] Maybe.
[00:29:05.760 --> 00:29:07.760] I mean, look, I'm not an attorney.
[00:29:07.760 --> 00:29:11.440] I know there's legal arguments around this too, around the First Amendment and other things.
[00:29:12.160 --> 00:29:13.840] I know that's out there.
[00:29:13.840 --> 00:29:20.480] However, we're not talking about, well, we are.
[00:29:20.960 --> 00:29:24.960] We're talking about the problem of kids.
[00:29:24.960 --> 00:29:28.800] And again, not a lawyer.
[00:29:29.120 --> 00:29:38.320] I know that there's a lot of the arguments of the First Amendment and it's going to, you know, hinge on First Amendment rights and so on.
[00:29:38.320 --> 00:29:41.120] But come on, there's got to be a way that we can do this.
[00:29:41.120 --> 00:29:45.440] Maybe it's really simple age verification when you download an app.
[00:29:45.440 --> 00:29:47.840] That wouldn't be too intrusive to adults.
[00:29:47.840 --> 00:29:51.040] We can grandfather it in so you don't have to do it now.
[00:29:51.040 --> 00:29:54.720] And then that will protect the kids and young teens of the future.
[00:29:54.720 --> 00:29:59.560] So we've got to do something about age verification.
[00:29:59.560 --> 00:30:00.520] Like a driver's license.
[00:29:59.600 --> 00:30:03.320] Like at airports, now they just scan your driver's license.
[00:30:03.320 --> 00:30:03.720] Right.
[00:29:59.840 --> 00:30:05.480] So that's the obvious solution.
[00:30:05.800 --> 00:30:11.800] And then some people have qualms about that, but there are so many companies that do age verification now.
[00:30:11.800 --> 00:30:14.120] They have their own trade association.
[00:30:14.120 --> 00:30:16.680] And a lot of the techniques are not government ID.
[00:30:16.680 --> 00:30:17.400] They're other things.
[00:30:17.880 --> 00:30:20.520] It's some AI techniques or other things.
[00:30:20.520 --> 00:30:23.240] So this is a little outside my realm of expertise.
[00:30:23.240 --> 00:30:26.680] You know, I'm not a coder or age verification provider.
[00:30:26.680 --> 00:30:31.880] But point being, there's a lot of different ways it can be done.
[00:30:32.200 --> 00:30:35.720] It's just that you need to get the political will to get the regulation in place.
[00:30:35.720 --> 00:30:37.800] Because right now it's pretty much all up to parents.
[00:30:37.800 --> 00:30:43.400] And that's so, it puts parents in such a difficult, almost impossible position.
[00:30:43.400 --> 00:30:46.600] I mean, I wrote the book so parents know what they can do.
[00:30:46.600 --> 00:30:47.000] Yeah.
[00:30:47.000 --> 00:30:51.640] But I'm also the first to say, yeah, look, you know, it shouldn't all be on us.
[00:30:51.640 --> 00:30:53.320] It shouldn't be this time consuming.
[00:30:53.320 --> 00:30:55.720] It shouldn't take this much effort and money.
[00:30:55.880 --> 00:30:59.560] We should have a government that will protect our kids from this stuff.
[00:30:59.560 --> 00:31:04.280] Okay, I know you got to run to your next interview because you're on your book tour, but it's a collective action problem.
[00:31:04.280 --> 00:31:06.520] I've tracked this for a few years.
[00:31:06.520 --> 00:31:16.440] You and Jonathan Haidt and Lenore Skenazy and others seem to have had some effect already with like schools now starting to ban phones during school hours.
[00:31:16.440 --> 00:31:17.720] That's good.
[00:31:17.720 --> 00:31:19.320] Yeah, absolutely.
[00:31:19.320 --> 00:31:21.480] And so I hope that will continue.
[00:31:21.480 --> 00:31:24.120] But to your point, it is a collective action problem.
[00:31:24.120 --> 00:31:28.280] That issue we brought up earlier of kids saying, but I'm the only one.
[00:31:28.600 --> 00:31:32.760] Let's imagine a world where, like, Australia is trying to do this.
[00:31:32.760 --> 00:31:34.600] They already have this law passed.
[00:31:34.600 --> 00:31:40.120] We're going to raise the minimum age for social media to 16, and we're actually going to verify age.
[00:31:40.120 --> 00:31:44.560] Then, no parent with a kid 15 or under would have that problem.
[00:31:44.560 --> 00:31:46.960] But, but all my friends are on it, right?
[00:31:48.240 --> 00:31:50.560] All my friends are drinking, right?
[00:31:44.920 --> 00:31:53.520] I know, and that's what really gets me going, to be honest.
[00:31:53.520 --> 00:31:58.880] When people are like, Well, you know, oh, there's no one age, and every kid is different.
[00:31:58.880 --> 00:32:01.600] You know, why are you saying a specific age?
[00:32:01.920 --> 00:32:07.520] Really, are we going to say, you know, why didn't we say, oh, some 12-year-olds are ready to drive, and some 20-year-olds aren't?
[00:32:07.600 --> 00:32:08.640] It just depends on the kid.
[00:32:08.720 --> 00:32:14.160] No, we picked an age, and we stuck with it and we enforce it, and we should do the same here.
[00:32:14.160 --> 00:32:18.080] And then free range, get them outside, yes, yeah, I know.
[00:32:18.080 --> 00:32:21.200] So, that's that's rule eight, and I'm a huge believer in that too.
[00:32:21.280 --> 00:32:27.200] Huge fan of Lenore Skenazy and her Let Grow organization.
[00:32:27.200 --> 00:32:38.800] I have a whole list of things in that chapter that you can do with kids 12 and under and 13 and up to teach them real-world skills, to give them independence, to give them, you know, tasks around the house.
[00:32:38.800 --> 00:32:45.760] Um, my favorite example is have them cook dinner every once in a while, then you don't have to cook dinner that night.
[00:32:45.760 --> 00:32:46.640] That's a good idea.
[00:32:46.720 --> 00:32:47.360] Love it.
[00:32:47.680 --> 00:32:52.960] I love public playgrounds. Just go out and watch; it's fun to just watch them out in the sunshine and air.
[00:32:53.440 --> 00:32:53.600] Exactly.
[00:32:53.760 --> 00:32:54.640] Yeah, what a difference.
[00:32:54.640 --> 00:32:56.640] Yeah, all right, thanks, Jean.
[00:32:56.640 --> 00:32:58.160] Thank you very much.
[00:33:15.760 --> 00:33:18.800] Download the DraftKings Sportsbook app and use
[00:33:18.680 --> 00:33:19.760] the code GOHARD.
[00:33:18.840 --> 00:33:25.400] Use the code GOHARD to receive hundreds of dollars in bonus bets instantly when you place your first five-dollar bet.
[00:33:25.400 --> 00:33:26.880] In collaboration with DraftKings.
[00:33:26.880 --> 00:33:27.920] The crown is yours.
[00:33:28.000 --> 00:33:30.440] Gambling problem? Call 1-800-GAMBLER.
[00:33:28.960 --> 00:33:32.760] In New York, call 877-8-HOPENY
[00:33:29.360 --> 00:33:36.360] or text the word HOPENY to 467369.
[00:33:36.520 --> 00:33:38.840] In Connecticut, help is available for problem gambling.
[00:33:38.920 --> 00:33:43.240] Call 888-789-7777 or visit ccpg.org.
[00:33:43.320 --> 00:33:47.720] Play responsibly. On behalf of Boot Hill Casino and Resort, Kansas. Must be 21 or older.
[00:33:47.720 --> 00:33:49.000] Charges may apply in Illinois.
[00:33:49.080 --> 00:33:51.880] Eligibility varies by jurisdiction. Not valid in Ontario.
[00:33:51.880 --> 00:33:53.720] Bonus bets are single use.
[00:33:53.720 --> 00:33:56.840] The stake is not included in the winnings, and bonus bets expire after being issued.
[00:33:56.840 --> 00:34:01.400] The offer ends September 29, 2025 at 11:59 PM Eastern time.
[00:34:01.400 --> 00:34:04.360] See the terms at draftkings.com for sportsbook promos.
[00:34:04.360 --> 00:34:06.120] Marketing is hard.
[00:34:06.440 --> 00:34:07.720] But I'll tell you a little secret.
[00:34:07.720 --> 00:34:08.520] It doesn't have to be.
[00:34:08.520 --> 00:34:09.800] Let me point something out.
[00:34:09.800 --> 00:34:12.120] You're listening to a podcast right now, and it's great.
[00:34:12.120 --> 00:34:12.920] You love the host.
[00:34:12.920 --> 00:34:14.200] You seek it out and download it.
[00:34:14.200 --> 00:34:18.040] You listen to it while driving, working out, cooking, even going to the bathroom.
[00:34:18.040 --> 00:34:20.840] Podcasts are a pretty close companion.
[00:34:20.840 --> 00:34:22.520] And this is a podcast ad.
[00:34:22.520 --> 00:34:23.960] Did I get your attention?
[00:34:23.960 --> 00:34:28.680] You can reach great listeners like yourself with podcast advertising from LibSyn Ads.
[00:34:28.680 --> 00:34:38.600] Choose from hundreds of top podcasts offering host endorsements or run a pre-produced ad like this one across thousands of shows to reach your target audience and their favorite podcasts with LibSyn ads.
[00:34:38.600 --> 00:34:40.200] Go to libsynads.com.
[00:34:40.200 --> 00:34:44.040] That's L-I-B-S-Y-N ads.com today.