Debug Information
Processing Details
- VTT File: swak425.vtt
- Processing Time: September 11, 2025 at 03:04 PM
- Total Chunks: 2
- Transcript Length: 83,398 characters
- Caption Count: 735 captions
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.240 --> 00:00:05.680] I'm no tech genius, but I knew if I wanted my business to crush it, I needed a website now.
[00:00:05.680 --> 00:00:07.680] Thankfully, Bluehost made it easy.
[00:00:07.680 --> 00:00:12.400] I customized, optimized, and monetized everything exactly how I wanted with AI.
[00:00:12.400 --> 00:00:14.160] In minutes, my site was up.
[00:00:14.160 --> 00:00:15.200] I couldn't believe it.
[00:00:15.200 --> 00:00:18.240] The search engine tools even helped me get more site visitors.
[00:00:18.240 --> 00:00:21.680] Whatever your passion project is, you can set it up with Bluehost.
[00:00:21.680 --> 00:00:24.720] With their 30-day money-back guarantee, what do you got to lose?
[00:00:24.720 --> 00:00:26.400] Head to bluehost.com.
[00:00:26.400 --> 00:00:30.880] That's B-L-U-E-H-O-S-T.com to start now.
[00:00:37.600 --> 00:00:45.760] It is Thursday, the 31st of July, 2025, and you're listening to Skeptics with a K, the podcast for science, reason, and critical thinking.
[00:00:45.760 --> 00:00:56.960] Skeptics with a K is produced by Skeptic Media in association with the Merseyside Skeptic Society, a non-profit organization for the promotion of scientific skepticism on Merseyside around the UK and internationally.
[00:00:56.960 --> 00:00:58.320] I'm your host, Mike Hall.
[00:00:58.320 --> 00:00:59.520] With me today is Marsh.
[00:00:59.520 --> 00:01:00.080] Hello.
[00:01:00.080 --> 00:01:00.960] And Alice.
[00:01:00.960 --> 00:01:01.760] Hello.
[00:01:02.080 --> 00:01:07.920] Well, it's actually the time of the year where we ask people for their nominations for our Occam Awards.
[00:01:07.920 --> 00:01:08.960] The OCAMS.
[00:01:08.960 --> 00:01:09.840] The Occam Awards.
[00:01:09.840 --> 00:01:12.400] So the Occam Awards, these have been going for a long time.
[00:01:12.560 --> 00:01:17.600] This is before we even took over the Skeptic Magazine as editors and publisher of the magazine.
[00:01:17.600 --> 00:01:22.160] They're there to celebrate the very best in skeptical work over the last year.
[00:01:22.160 --> 00:01:34.080] So we ask for nominations as to who you think has done some excellent work, where have they had a wide reach, where they had an impact, maybe they've even changed something for the better, actually had a tangible impact in the world.
[00:01:34.080 --> 00:01:35.680] Who's doing some excellent work?
[00:01:35.680 --> 00:01:38.640] Or even who's doing work in the face of some serious adversity?
[00:01:38.640 --> 00:01:43.760] You know, who's actually coming up against some serious opposition from the pseudo-scientists and things out there.
[00:01:43.760 --> 00:01:46.480] So we always give out essentially the skeptic of the year award.
[00:01:46.480 --> 00:01:50.080] It's not quite called that, but that is essentially thematically what it's all about.
[00:01:50.080 --> 00:01:56.640] And then the other thing we will be giving out will be the Rusty Razor Award, which is the exact opposite of that.
[00:01:56.640 --> 00:01:59.880] So, who out there has been peddling pseudoscience?
[00:01:59.600 --> 00:02:02.840] What organization has been pushing misinformation?
[00:02:03.160 --> 00:02:12.760] What product is completely nonsense and woo in a way that we ask people to think about what harm has actually been done and how far has it actually got and who how many people are believing in?
[00:02:12.760 --> 00:02:15.000] Is this making the world appreciably worse?
[00:02:15.000 --> 00:02:24.120] Essentially, and if you go to the website of the skeptic magazine, so skeptic.org.uk, you can go there and you can find the nomination form for the Occams and the Rusty Razor.
[00:02:24.120 --> 00:02:28.520] And you can tell us who deserves either one or both, and you can tell us your justification.
[00:02:28.680 --> 00:02:43.240] We will look through the nominations, and as an editorial board, we'll decide who is the winner, and then we'll announce that winner of the skeptic of the year and of the rusty razor award at this year's QED on the Saturday night before the Gala Dinner, before the Gala Dinner, before the comedy at QED.
[00:02:43.240 --> 00:02:45.400] So, we announce it there on stage every year.
[00:02:45.400 --> 00:02:48.680] So, we want lots and lots of people to send us their nominations.
[00:02:52.520 --> 00:02:54.200] Porn porn, porny porn.
[00:02:54.200 --> 00:02:55.640] We're going to talk a lot about porn.
[00:02:55.640 --> 00:03:03.880] So, if you don't want to hear about porn or you're listening with someone who you don't want to hear about porn, then stop listening before we start talking about the porn, porn, porn, porn, porn.
[00:03:03.880 --> 00:03:10.200] Yeah, what if you're with somebody who you don't even want to know that porn as a concept exists?
[00:03:10.200 --> 00:03:12.760] Too late now, yeah, very much so.
[00:03:12.760 --> 00:03:17.080] That cat's out the back because it's not like you said it once, no, and it could have slipped by.
[00:03:17.080 --> 00:03:20.200] You very much made it front and center of every part of the conversation, yeah.
[00:03:20.200 --> 00:03:25.640] And there are much more succinct ways I could have said that, I imagine, but you know, fuck it, my show in it.
[00:03:25.640 --> 00:03:37.000] So, in the last few days, some key provisions of the Online Safety Act have come into effect following many, many years of delays and implementation difficulties.
[00:03:37.000 --> 00:03:53.280] The Act creates a duty of care for online service providers to not only take action against illegal content, which is commonly misunderstood to refer to sexual abuse material, although it's other stuff as well, but mostly that's where people's attention is focused.
[00:03:53.280 --> 00:04:00.720] But also legal content, which nevertheless may be harmful for children, which is commonly understood to mean porn.
[00:04:00.720 --> 00:04:01.520] Yes, yeah.
[00:04:01.840 --> 00:04:06.800] Now, we first talked about this way back in episode 113.
[00:04:06.800 --> 00:04:07.440] Did we?
[00:04:07.440 --> 00:04:16.800] When the David Cameron government was trying to implement very similar legislation at that point, where this was known as the David Cameron porn filter.
[00:04:16.800 --> 00:04:17.360] Yeah.
[00:04:17.600 --> 00:04:26.080] And our joke at the time was this is a filter introduced by David Cameron to remove porn, not a filter to remove David Cameron porn.
[00:04:26.400 --> 00:04:29.200] Although you should definitely be removing David Cameron.
[00:04:29.280 --> 00:04:31.680] It would actually function as porn, essentially, yes.
[00:04:32.000 --> 00:04:46.880] It was widely criticized at the time, the David Cameron porn filter, for being both a naive and misguided piece of legislation, which is what some people claimed, and other people claimed it was a deliberately Orwellian attempt to control the flow of information into the UK.
[00:04:46.880 --> 00:04:47.680] Yeah.
[00:04:48.320 --> 00:04:55.520] And I criticized it at the time for being based on a rather flimsy appeal to emotion, that being what think of the children.
[00:04:55.760 --> 00:04:56.880] Yeah, yeah, absolutely.
[00:04:56.880 --> 00:05:00.400] But also on the basis that it wouldn't work anyway.
[00:05:00.400 --> 00:05:05.120] The filter as proposed would not actually function in the way that they expected.
[00:05:05.120 --> 00:05:16.480] Think of the children is also a very common argument you'll find in the more pearl-clutchy end of politics, which can be both from the left or the right, but it's very, very commonly cited in there.
[00:05:16.480 --> 00:05:29.840] And I would also argue that that's a logical fallacy as well as a thought-terminating cliché in the sense that once someone has said, but think of the children, any attempt to continue the conversation is, well, you want to hurt children then.
[00:05:29.840 --> 00:05:30.680] Yeah, yeah.
[00:05:31.000 --> 00:05:36.440] And it's also the argument from children is a key indicator of a moral panic as well.
[00:05:36.440 --> 00:05:39.800] So many moral panics are founded on, but it's hurting our children.
[00:05:40.040 --> 00:05:40.760] Yeah, exactly.
[00:05:40.760 --> 00:05:58.360] And especially when they say children, you don't think people who might be 17 falling under the age of 18 who are caught up in bans on porn and therefore the very weird situation where somebody can be legally having sex with their person they are legally allowed to have sex with.
[00:05:58.360 --> 00:06:08.760] But if they were to film themselves doing so, they'd be committing a serious sexual offense, even with everybody involved completely consensually and those kind of weird things.
[00:06:08.760 --> 00:06:17.480] I think there was even a story of somebody who got into bother for possessing child sexual abuse material because he was 16 or 17 and it was a picture of himself.
[00:06:19.400 --> 00:06:28.120] The current version of the online safety bill was initially introduced by the Boris Johnson government, although it started being drafted under Theresa May.
[00:06:28.200 --> 00:06:30.360] It was actually introduced under Boris Johnson.
[00:06:30.360 --> 00:06:32.280] So like three prime ministers ago.
[00:06:32.280 --> 00:06:35.640] I would say it's hard to know how many prime ministers that actually is at this point.
[00:06:35.640 --> 00:06:42.040] Like when you said Boris Johnson, I genuinely had to stop and think, I can't remember when that was because so much has happened since then.
[00:06:42.440 --> 00:06:45.560] It is three prime ministers ago, but it has since become law.
[00:06:45.560 --> 00:06:45.960] Yeah.
[00:06:45.960 --> 00:06:48.840] It was given royal assent, I think, October 2023.
[00:06:48.840 --> 00:06:54.120] So just in the fading days of the Rishi Sunak government, this finally came in law.
[00:06:54.120 --> 00:06:57.000] And the provisions are starting to come into effect now.
[00:06:57.000 --> 00:06:59.880] And that is a weird quirk of British law.
[00:07:00.040 --> 00:07:12.760] And maybe not just British law, but of law generally, that a government that was very clearly dying, like the Rishi Sunak government, by the late late October of 2023, it was pretty clear it was not going to win the next election.
[00:07:13.240 --> 00:07:14.600] That was pretty nailed on.
[00:07:14.600 --> 00:07:21.680] So that government was able to pass a law that it handed to its successors to say, go on, then make this work.
[00:07:22.240 --> 00:07:22.960] And then left.
[00:07:22.960 --> 00:07:26.800] And now it comes in and it's like, how dare Kia's Dharma's government do this?
[00:07:26.800 --> 00:07:29.120] Yes, and I'm seeing a lot of commentary along those lines.
[00:07:29.440 --> 00:07:31.280] Yeah, Starmer's totalitarianism.
[00:07:31.280 --> 00:07:34.080] And look, there are things that they could have done.
[00:07:34.320 --> 00:07:36.640] They absolutely could have handled this way, way better.
[00:07:36.640 --> 00:07:36.880] Yeah.
[00:07:36.880 --> 00:07:38.400] But they didn't propose this law.
[00:07:38.400 --> 00:07:39.760] They were forced to go ahead with it.
[00:07:40.880 --> 00:07:41.200] Absolutely.
[00:07:41.760 --> 00:07:42.480] You know what I mean?
[00:07:42.480 --> 00:07:43.040] Yeah.
[00:07:43.360 --> 00:07:49.680] So as of July the 25th, 2025, so this is very, very recent.
[00:07:49.680 --> 00:07:54.880] Many websites have introduced mandatory age verification for UK users.
[00:07:55.440 --> 00:07:58.240] This includes porn sites, of course, which makes sense.
[00:07:58.240 --> 00:08:00.480] It also includes many not porn sites.
[00:08:00.480 --> 00:08:01.200] Which doesn't.
[00:08:01.200 --> 00:08:06.880] Because some of those not porn sites nevertheless host some explicit content.
[00:08:06.880 --> 00:08:12.080] So Blue Sky, for example, now requires age verification for users in the UK.
[00:08:12.080 --> 00:08:13.200] So do Reddit.
[00:08:13.200 --> 00:08:14.880] So does Discord.
[00:08:14.880 --> 00:08:22.000] And also, of course, Reddit and Discord, it depends on which Reddits you're looking at and which Discord servers you're signed up to.
[00:08:22.000 --> 00:08:27.520] But if you are signed up to ones that are deemed to be adult, then you have to age verify now.
[00:08:27.520 --> 00:08:33.920] And as well as that, you also have full-on proper porn sites like Pornhub and Red Tube and X Videos and so on.
[00:08:33.920 --> 00:08:36.640] And you have to age verify to access those now.
[00:08:36.640 --> 00:08:40.640] Which reminds me, this episode of Skeptics with the K is sponsored by X Videos.
[00:08:40.640 --> 00:08:44.080] Try X Videos in case you need a wink.
[00:08:45.040 --> 00:08:45.520] So anyway.
[00:08:45.680 --> 00:08:46.320] X Videos.
[00:08:46.320 --> 00:08:47.920] I thought it was 10 videos.
[00:08:48.400 --> 00:08:49.200] Only 10 of them on the 10th.
[00:08:49.280 --> 00:08:50.160] Yeah, they've been on the website.
[00:08:50.160 --> 00:08:51.360] I assume there's only the documents.
[00:08:51.000 --> 00:08:52.080] There's just a dozen.
[00:08:52.080 --> 00:08:55.360] Yeah, you have to change them regularly when a new piece of porn comes out.
[00:08:56.000 --> 00:08:59.720] It's just the most recent 10 porn videos on the internet, is what it is.
[00:08:59.600 --> 00:09:05.320] So they change, they update every week or so when more content gets added to the internet.
[00:09:06.280 --> 00:09:08.440] Anyway, so this is now the law of the land.
[00:09:08.440 --> 00:09:12.280] But there are some significant issues with this legislation.
[00:09:12.280 --> 00:09:20.040] So in the interests of locking the stable door after the horse has bolted, I want to talk about what the issues with this are.
[00:09:20.040 --> 00:09:21.640] Because there's nothing we can do about it now.
[00:09:21.640 --> 00:09:23.080] It's already come into effect.
[00:09:23.080 --> 00:09:27.400] There is a parliamentary petition saying please repeal the Online Safety Act.
[00:09:27.640 --> 00:09:29.640] It's had over 300,000 signatures.
[00:09:29.640 --> 00:09:31.640] It's due a response and a debate.
[00:09:31.640 --> 00:09:34.040] It's not going to get anywhere because those positions never do.
[00:09:34.040 --> 00:09:36.840] I think it's been one ever since they were introduced.
[00:09:36.840 --> 00:09:43.560] And I think that was by the Gordon Brown government that introduced those petitions and one has ever actually changed the law.
[00:09:43.560 --> 00:09:44.840] Here's the thing though.
[00:09:44.840 --> 00:09:50.680] How good are the spam filters and text filters on the petition website?
[00:09:50.680 --> 00:10:09.320] Because could you sign up with names that had inserts of explicit material, therefore forcing the petition website to be the publisher of explicit material and therefore forcing the government's petition website to require age verification for accessing?
[00:10:09.640 --> 00:10:11.080] Possibly.
[00:10:11.400 --> 00:10:13.400] I think someone should try that.
[00:10:14.360 --> 00:10:21.240] So as mentioned, many platforms have introduced age verification to comply with the requirements of this bill.
[00:10:21.240 --> 00:10:29.320] So if you want to access this sort of content, probably because you fancy having a wank, you now have to demonstrate that you are over the age of 18 to do it.
[00:10:29.320 --> 00:10:31.880] And happily, teenagers never want a wank.
[00:10:32.040 --> 00:10:32.520] No, they don't.
[00:10:32.760 --> 00:10:33.720] Famously.
[00:10:33.720 --> 00:10:37.400] So we lucked out there, because otherwise that could have been really awkward.
[00:10:37.400 --> 00:10:42.760] Blue Sky, for example, have implemented this using a service called KWS, which is kids' web services.
[00:10:42.760 --> 00:10:46.800] That is a service provided by Epic Games, actually the people who made Fortnite.
[00:10:44.760 --> 00:10:50.560] So they had this service off the shelf that they were using for age-verifying players anyway.
[00:10:50.880 --> 00:10:56.800] Blue Sky has integrated that into their system, and this verifies you were an adult using one of two mechanisms.
[00:10:56.800 --> 00:11:04.240] You can either enter your credit card number or you can have an AI look at you and guess how old you are by looking at your face.
[00:11:04.880 --> 00:11:06.720] I bet that works spectacularly well.
[00:11:06.880 --> 00:11:07.360] It does.
[00:11:07.360 --> 00:11:08.880] And it's unhackable.
[00:11:09.520 --> 00:11:14.400] Both of these mechanisms have got significant problems with both false positives and false negatives.
[00:11:14.400 --> 00:11:18.080] Have you seen how good the youth of today are at makeup?
[00:11:18.080 --> 00:11:23.840] I don't consider incredible at makeup.
[00:11:23.840 --> 00:11:29.920] Like, the younger generations are properly nailed on makeup artists, essentially.
[00:11:29.920 --> 00:11:34.640] I mean, to be honest, if they're going through the effort of putting makeup on, they're doing it the hard way to get around these.
[00:11:34.720 --> 00:11:35.440] Oh, well, yes, true.
[00:11:36.400 --> 00:11:45.600] And the good thing about the credit card is there's never a point in any teenagers' history where they are anywhere near a parent's credit card for a minute's worth of time.
[00:11:45.760 --> 00:11:46.320] Never happens.
[00:11:46.480 --> 00:11:48.000] We'll come back to that.
[00:11:48.000 --> 00:11:53.200] So, there is a very well-regarded organization in the United States called NIST, which is part of the U.S.
[00:11:53.200 --> 00:11:53.600] government.
[00:11:53.600 --> 00:11:56.800] It's the National Institute of Standards and Technology.
[00:11:56.800 --> 00:12:02.720] And last year, they published their latest evaluation of age estimation software.
[00:12:02.720 --> 00:12:15.520] So, for this evaluation, they took 11 million pictures of people with known ages and ran them through six different software systems that estimated the age of the person based on the picture.
[00:12:15.840 --> 00:12:18.640] And we're going to look at three of their key metrics.
[00:12:18.640 --> 00:12:19.200] Okay.
[00:12:19.840 --> 00:12:21.880] So, the first one, the first metric is.
[00:12:22.120 --> 00:12:26.560] Is it where they're wearing a badge saying 21 today.
[00:12:26.560 --> 00:12:28.640] Yes, it's a dead giveaway.
[00:12:28.640 --> 00:12:29.040] Yeah.
[00:12:29.040 --> 00:12:31.720] The first one is mean absolute error.
[00:12:29.840 --> 00:12:36.520] So this is how far away from the person's true age is the computer's guess.
[00:12:36.840 --> 00:12:38.600] So mean absolute error.
[00:12:38.600 --> 00:12:52.600] And NIST reported that depending on several factors like the quality of the photograph and which algorithm was used, the mean absolute error ranges from 2.3 years to 5.1 years, which is quite a large error, actually.
[00:12:52.600 --> 00:13:00.520] It's maybe not so much of a problem if you're 56 and the computer incorrectly guesses that you're 51, because it doesn't make a blind bit of difference, really.
[00:13:00.520 --> 00:14:01.200] But it makes a huge difference when you're 23 and the computer thinks you look 17 and now stops you from accessing things you are legally part of your life you are legally allowed to explore yes that you'll be locked out because a computer says no and there's no way for you to particularly argue with that yeah i'm interested i don't know i assume you've not covered this because it's not relevant to the point you're making but i'm interested if humans are better at that especially if you're seeing a person in person because of other like unmeasurable cues that we just automatically pick up humans can be shit at it as well but i'd be i'd be kind of interested to know if humans are a bit better than machines the problem from a from a purely theoretical point of view that would be an interesting question from a policy point of view what will happen is the humans will ask everyone we know that because what we're talking about is challenge 25 that we found out alcohol forever i don't want to rely on whether i'm good at guessing this so i will just ask i'll just make everybody provide identification before they access alcohol.
[00:14:01.520 --> 00:14:04.320] I'm not making a point at all at all about it.
[00:14:04.640 --> 00:14:06.560] Just vaguely out of interest.
[00:14:06.560 --> 00:14:13.760] Yeah, but it feels like something that humans would be better at than machines for some intangible, unknowable, unmeasurable reason.
[00:14:13.760 --> 00:14:23.760] But the reason I'm bringing that together is I think this is downstream of the challenge 21, challenge 25 kind of mentality of like, well, you have to prove it at this point, otherwise, we're going to draw a line.
[00:14:23.760 --> 00:14:31.440] And it's the same sort of thing to me: is that now you have to prove your age for these other services, and before you know it, it's ID for everything.
[00:14:31.440 --> 00:14:33.360] I know that sounds slippery slope, but yeah.
[00:14:33.680 --> 00:14:37.280] So, yeah, it's a problem if you're 23 and the computer says you look 17.
[00:14:37.280 --> 00:14:42.320] It's arguably a bigger problem if the computer says you look 18, but you're actually 12.
[00:14:42.560 --> 00:14:42.880] Yes.
[00:14:42.880 --> 00:14:43.840] Yes, yeah, yeah, yeah.
[00:14:43.840 --> 00:14:49.360] Because the stated goal here is to protect children from accessing content which the government says is harmful.
[00:14:49.360 --> 00:14:58.160] And that fails spectacularly when your software lets through 12 and 13 year olds because the face estimation software has got error bars of five and a bit years.
[00:14:58.160 --> 00:15:14.800] I wonder whether those error bars are weighted in one direction, though, whether it's more likely to underestimate your age rather than overestimate it for very significant, like 12, you're unlikely to mistake a 12-year-old for an 18-year-old because of the physiological changes that happen very, very quickly between the ages of 11 and 15 or whatever.
[00:15:14.800 --> 00:15:17.440] Possibly, we've got other metrics that we can look at for that.
[00:15:17.440 --> 00:15:18.000] Okay.
[00:15:18.000 --> 00:15:28.800] So, NIST were, of course, mindful that the true purpose of face age estimation software is to facilitate access to adult services, whether that's porn or buying alcohol or whatever.
[00:15:29.120 --> 00:15:34.320] So, the second metric that they looked at was the binary false negative rate.
[00:15:34.320 --> 00:15:38.640] So, that is to say, if you ask the software, is this person of legal age?
[00:15:38.640 --> 00:15:42.400] How often does the software say no, even when you are?
[00:15:42.800 --> 00:15:43.520] Right?
[00:15:43.840 --> 00:15:50.160] And what they found in that test, was errors in the range between 4% and 30%.
[00:15:50.480 --> 00:16:05.080] So, that is to say, for people who were adults, between 4 and 30% of the time, depending on the algorithm used, the quality of the picture, that sort of stuff, between 4 and 30% of the time, the software would say, no, sorry, you're too young, and would deny access to that.
[00:16:05.880 --> 00:16:08.680] That's nearly a third of times on the outside edge.
[00:16:08.680 --> 00:16:14.840] Did they break any of that down by, for example, protected characteristics status?
[00:16:14.840 --> 00:16:20.120] Yes, well, we'll come to the demographic confounders in a moment as well.
[00:16:20.120 --> 00:16:27.800] So the flip side of that, again, arguably more concerning given the goals of this legislation, is the binary false positive rate.
[00:16:28.200 --> 00:16:29.880] So that's the third metric.
[00:16:29.880 --> 00:16:37.640] NIST took images of children aged 13 to 17 and again asked the software, is this person of age?
[00:16:37.640 --> 00:16:42.520] And in some cases, the error rate was up to 45% in these cases.
[00:16:42.520 --> 00:16:49.160] So in 45% of cases, again, depending on the quality of the picture, which algorithm was being used, that sort of thing.
[00:16:49.160 --> 00:16:58.040] But in up to 45% of the cases, it said, yes, that person is an adult when they were aged between 13 and 17 years old.
[00:16:59.400 --> 00:17:09.720] So in this case, in the case before, where for the false negative rate, in the event of an error, an adult is denied access to services that they should be allowed to access.
[00:17:10.040 --> 00:17:17.400] In the case of the false positive error, a child is granted access to pornography, which based on this face age estimation software.
[00:17:17.400 --> 00:17:19.000] So that's not brilliant, right?
[00:17:19.640 --> 00:17:38.120] So from both ends of this argument, whether your position is this is censorship, this is terrible, adults should be able to access whatever they want, and now 30% of adults can't, or whether your position is think of the children, they need to be protected from the heinous things in this world, but 45% of children are granted access to this anyway.
[00:17:38.600 --> 00:17:42.040] Either way round, you look at this, the system is not sufficient.
[00:17:42.680 --> 00:17:45.360] And nobody should be happy with these metrics.
[00:17:44.520 --> 00:17:49.120] These problems are then further exacerbated by demographic biases.
[00:17:49.200 --> 00:18:06.480] So, for white faces compared to Asian and African faces, NIST estimates that the error rates for Asian and African faces are one to two orders of magnitude higher, with some software estimating the face of Africans with an error of over 40 years.
[00:18:06.480 --> 00:18:08.480] Now, that's on the young side.
[00:18:08.480 --> 00:18:12.080] It's older Africans being estimated as much younger than they are.
[00:18:12.080 --> 00:18:19.200] But nevertheless, this age-guessing software is dreadful at guessing the age of non-white people.
[00:18:19.200 --> 00:18:20.000] Yeah.
[00:18:20.320 --> 00:18:25.360] In terms of other protected characteristics, they don't look at that in the paper that I looked at.
[00:18:25.360 --> 00:18:29.840] But I can imagine certain disabilities being impacted significantly.
[00:18:30.320 --> 00:18:30.880] Yeah.
[00:18:31.840 --> 00:18:38.640] The fallback mechanism that's used by many of these services is the old favorite of the Porny website, which is the credit card check.
[00:18:38.640 --> 00:18:47.440] So, if you fail the face age or you just don't fancy the face age, the software can say, all right, you look young to me, so try this instead.
[00:18:47.760 --> 00:18:55.440] And children cannot, they can legally have a debit card in the UK, but you cannot get a credit card unless you are aged 18.
[00:18:55.680 --> 00:18:57.440] Oh, it has to be a credit card, not a debit card.
[00:18:57.440 --> 00:18:58.240] It has to be a credit card.
[00:18:58.480 --> 00:18:58.960] Oh, my God.
[00:18:59.200 --> 00:19:07.360] That's a major issue because just having a credit card is a risk factor for getting into credit card debt.
[00:19:07.360 --> 00:19:08.080] Yes.
[00:19:08.400 --> 00:19:13.760] And so, some services like Blue Sky have implemented a credit card check as a fallback mechanism.
[00:19:13.760 --> 00:19:19.280] So, they don't charge your card, or if they do, they'll charge a nominal amount, like 5p.
[00:19:19.600 --> 00:19:24.800] But all they're doing is checking: is this card valid and is it a credit, not a debit card?
[00:19:24.800 --> 00:19:28.560] Because it has to be a credit card because credit cards are only available to over 18s.
[00:19:28.640 --> 00:19:30.600] Debit card, you get one when you're fucking 12, right?
[00:19:29.920 --> 00:19:36.840] But they're, of course, not matching it to your name and shouldn't be matching it to your name because that would be majorly problematic.
[00:19:37.160 --> 00:19:42.200] But also, that means, as you say, you can go into your mom's wallet and grab a credit card.
[00:19:42.200 --> 00:19:46.520] I mean, and obviously, yes, you have to give someone credit card details for that.
[00:19:46.520 --> 00:19:49.080] And okay, I really, really trust this system they put in.
[00:19:49.080 --> 00:19:51.160] I really trust whoever they've hired to do this.
[00:19:51.160 --> 00:19:53.160] And the good thing is, these things never get hacked.
[00:19:53.320 --> 00:19:54.280] Never ever get hacked.
[00:19:54.280 --> 00:19:56.280] No, so there's a few problems with that.
[00:19:56.280 --> 00:20:01.640] We've touched on a couple of them, but many adults, especially those on low incomes, do not have credit cards.
[00:20:01.640 --> 00:20:02.360] Yep.
[00:20:02.360 --> 00:20:07.960] So, according to the website Merchant Savvy, in 2024, 35% of adults do not have a credit card.
[00:20:07.960 --> 00:20:09.640] Yeah, yeah, neither of my parents had credit cards.
[00:20:09.640 --> 00:20:13.320] I grew up, we never had a credit card in the house the first time with a credit card.
[00:20:13.320 --> 00:20:15.400] I got it maybe eight years ago.
[00:20:15.400 --> 00:20:21.880] Yeah, that goes up to 71% of adults aged between 18 and 24 who do not have a credit card.
[00:20:21.880 --> 00:20:25.720] And these are the adults who are the ones most likely to fail the face age check.
[00:20:25.800 --> 00:20:27.800] Yeah, just like because they're closer to the calf.
[00:20:27.800 --> 00:20:28.040] Yeah.
[00:20:28.040 --> 00:20:28.600] Yeah.
[00:20:28.600 --> 00:20:31.960] 71% of adults age 18 to 24 do not have a credit card.
[00:20:31.960 --> 00:20:37.560] So even this fallback mechanism is locking adults out of legal content.
[00:20:37.560 --> 00:20:41.240] Albeit adult content, but content that they are legally able to access.
[00:20:41.240 --> 00:20:45.160] Or forcing them to get a credit card, forcing them to go through a credit card check.
[00:20:45.160 --> 00:20:49.640] Possibly damaging their credit rating, possibly encouraging them to get into debt and so on.
[00:20:49.640 --> 00:20:56.200] It is literally saying if you want to access OnlyFans, you first need to have a credit card that you put into the website, OnlyFans.
[00:20:56.920 --> 00:20:58.280] Now don't get into bother.
[00:20:58.280 --> 00:21:07.560] So assuming an adult population of 55 million people, if 30% of them fail the phase age check, that's 16.5 million people who fall back to the credit card check.
[00:21:07.560 --> 00:21:16.560] 35% of those don't have credit cards, that's 5.5 million adults who could be denied lawful access to content in the name of protecting children.
[00:21:14.760 --> 00:21:18.400] Obviously, that's back of the envelope numbers.
[00:21:18.480 --> 00:21:24.160] It's not going to be as straightforward as that, but that's you know, that's roughly where we're going to be.
[00:21:24.800 --> 00:21:27.520] So that's the false negative rate for credit cards.
[00:21:27.520 --> 00:21:29.040] What about the false positive rate?
[00:21:29.040 --> 00:21:35.360] How often do credit do credit card checks effectively prevent children from accessing adult content?
[00:21:36.000 --> 00:21:44.640] The Online Safety Act explicitly says a credit card is an acceptable tool for the purposes of age verification.
[00:21:44.640 --> 00:21:56.240] But it is not difficult to imagine, as we've already said, a 14, 15, 16-year-old wanting access to some assault service and asking their mum and dad to lend them a card or just taking the card without permission.
[00:21:56.240 --> 00:21:59.920] And Tesco's won't let you do it for buying alcohol.
[00:21:59.920 --> 00:22:15.040] I've been into Tesco before or any other supermarket before where I've been age verified for buying alcohol and not had my ID with me because I didn't think I needed an ID at 35 and gone, well, look, I've got a credit card, like at least that proves them over 18, right?
[00:22:15.040 --> 00:22:16.480] And they've gone, no, we won't take that.
[00:22:16.640 --> 00:22:16.960] Yeah.
[00:22:16.960 --> 00:22:21.840] Well, the Online Safety Act says it is explicitly acceptable as a form of ID.
[00:22:21.840 --> 00:22:37.200] I have not been able to find specific statistics about the misuse of credit cards by minors in the UK, but the Citizens Advice website says that it is, quote, quite common for a family member to use your card without permission.
[00:22:37.840 --> 00:22:46.080] But even leaving aside fraudulent use, the use of Discord, for example, is extremely widespread amongst teenagers.
[00:22:46.080 --> 00:22:50.080] It's a very, very widely used platform for friends to communicate with each other.
[00:22:50.080 --> 00:22:54.080] And now, Discord is requiring age verification to access adult content.
[00:22:54.080 --> 00:23:09.320] However, mum and dad might not realize that that's what the verification is for and may allow their child to use the card to verify with Discord, not really appreciating that this isn't a simple verify my account, it's an age check to access adult content.
[00:23:09.320 --> 00:23:15.240] Or might know that's what it's for and just trust the kids to make an educational choice for themselves and pop it in.
[00:23:15.240 --> 00:23:22.200] I'm sure plenty of parents will be like, well, I mean, how many parents let the kids play age in appropriate according to the scoring system?
[00:23:22.200 --> 00:23:22.760] Video games.
[00:23:23.000 --> 00:23:26.920] Video games or watch fucking Terminator when they're eight or whatever.
[00:23:27.240 --> 00:23:28.040] Yeah, yeah.
[00:23:28.040 --> 00:23:35.160] Another way to circumvent these checks, which is proving to be very common, is to use a video game with what's called photo mode enabled.
[00:23:35.160 --> 00:23:44.760] So when you're doing the face age estimation checks, typically they will ask you to perform certain actions so the software can tell you're not just holding a photo up of your granddad or something like that.
[00:23:44.760 --> 00:23:48.120] So it might say open your mouth, turn your head left, things like that.
[00:23:48.120 --> 00:23:49.880] You'll have done this when you sign it with Monzo.
[00:23:50.200 --> 00:23:53.160] They've got very similar read a sentence out, yeah.
[00:23:53.160 --> 00:23:53.800] Yeah.
[00:23:54.120 --> 00:24:02.760] But these are also actions that you can get in-game characters to perform in photo mode in games like Death Stranding.
[00:24:03.080 --> 00:24:06.120] So you can make the game character pose in a certain way.
[00:24:06.120 --> 00:24:09.960] You point your camera at the screen and it's a photo reel game character.
[00:24:09.960 --> 00:24:21.720] And when the verification tool says, right now open your mouth, you make it open its mouth and then Shalomy Gallami Zoop, there it is, you're approved without pinching anybody's credit card and while you're still 13 years old.
[00:24:22.040 --> 00:24:36.840] So there is comprehensive empirical research that shows that age verification technologies like this fail to meaningfully protect children while also creating substantial privacy and accessibility risks to adult consumers of adult content.
[00:24:36.840 --> 00:24:48.640] Because many people, of course, even if they have a credit card, even if they do have a government ID, even if they would pass the face age checks, who wants their photograph, passport, and credit card associated with an online porn site?
[00:24:48.640 --> 00:24:49.440] Yeah.
[00:24:49.760 --> 00:24:52.800] And it's not even just porn, it's what's deemed adult content.
[00:24:52.800 --> 00:25:00.160] And that category can contain information about gender diversity, information about sexual sexuality diversity.
[00:25:00.160 --> 00:25:08.800] And who wants to give their face and credit card number to the website they visit to figure out if they're gay or not, to explore whether they're gay or not?
[00:25:09.120 --> 00:25:15.760] Rightly or wrongly, we've got a culture which is still very judgmental about porn and the use of pornography as an aid to sex or masturbation.
[00:25:15.760 --> 00:25:23.120] And that social shame will be significant enough a pressure for some people that this effectively amounts to government censorship.
[00:25:23.120 --> 00:25:23.680] Yes.
[00:25:23.680 --> 00:25:26.960] Because there is enough pressure from just from that social stigma.
[00:25:26.960 --> 00:25:39.360] I might add at this point, although this is not the roads that I'm going to go down, that might very well be the point because this sort of age verification legislation is being pushed and lobbied for very heavily by evangelical Christian countries.
[00:25:39.360 --> 00:25:40.080] Yes, it is, yeah.
[00:25:40.080 --> 00:25:50.160] Both here and in the US, with the express intention of using that social stigma to stop even adults from accessing pornography, but dressing it up as child protection.
[00:25:50.480 --> 00:26:01.120] So moving on from the efficacy of verification tools, how effective are the age gate blocks themselves at preventing access to adult content?
[00:26:01.440 --> 00:26:04.400] So there are a couple of components to this.
[00:26:04.720 --> 00:26:07.920] For one particular adult website, it's not one that I've mentioned.
[00:26:07.920 --> 00:26:10.560] I'm not going to mention it, but it is a proper porn site.
[00:26:10.560 --> 00:26:15.680] It's not like Reddit or something where just it's an ordinary site, but some people post porn on it.
[00:26:15.680 --> 00:26:24.720] Upon visiting this site, I was greeted with a big verify your age overlay, and I opened DevTools on Chrome and deleted it and then watched the porn.
[00:26:25.680 --> 00:26:27.360] Absolutely fine, no problem at all.
[00:26:27.360 --> 00:26:27.840] Brilliant.
[00:26:27.800 --> 00:26:28.040] Yep.
[00:26:28.240 --> 00:26:36.120] I was also able to do the same thing from my phone with a feature that Apple have called hide distracting items, which is meant to be used for removing ads.
[00:26:36.200 --> 00:26:39.720] So you can click on the ad and say hide this, and it just deletes it.
[00:26:39.720 --> 00:26:44.520] Also removes the age verification pop-up and just you can watch the porn that's there anyway.
[00:26:44.520 --> 00:26:49.320] Yeah, but the hide distracting items also completely remove what you were looking for in the porn video.
[00:26:49.320 --> 00:26:52.200] So no, this would be way too distracting for you.
[00:26:52.520 --> 00:26:57.880] Now, this isn't something that works on every single adult website, but it works on at least some.
[00:26:57.880 --> 00:27:05.400] It would also be trivial enough to code up a browser extension for Chrome that could detect and remove these age verification overlays.
[00:27:05.400 --> 00:27:18.360] I imagine many children would know that friend who knows how DevTools works or can get them a kind of inline script that's called a bookmarklet where it becomes a bookmark on your button and you just click that and it would disappear.
[00:27:18.360 --> 00:27:24.360] When I was a teenager, my dad put a block on MSN Messenger.
[00:27:24.360 --> 00:27:25.880] Remember when I was a message?
[00:27:26.040 --> 00:27:28.280] So you could only use it for an hour at a time.
[00:27:28.600 --> 00:27:32.440] But I was in a long-distance relationship with Warren at that time, so I wanted to speak to him with mum.
[00:27:32.680 --> 00:27:33.960] I knew all the routes to get around it.
[00:27:33.960 --> 00:27:39.640] Like you just, I was on MSN Messenger for one hour and then I logged into like various different routes that you could use as well.
[00:27:39.640 --> 00:27:40.840] Just use it online.
[00:27:41.320 --> 00:27:42.520] It didn't do anything.
[00:27:42.520 --> 00:27:45.400] It just took me a little bit longer to get access to it.
[00:27:45.400 --> 00:27:47.560] Because teenagers are the best at figuring that stuff out.
[00:27:47.880 --> 00:27:48.920] Digital natives, yeah.
[00:27:48.920 --> 00:27:51.240] And they don't even necessarily need to understand it.
[00:27:51.240 --> 00:27:55.160] They just need to know someone who knows someone who knows someone who understands it, right?
[00:27:55.160 --> 00:27:57.000] Or be capable of Googling.
[00:27:57.000 --> 00:27:59.800] Or be capable of asking AI to tell them.
[00:28:00.040 --> 00:28:02.280] Chat GPT, how do I circumvent this?
[00:28:02.440 --> 00:28:02.840] Oh, cool.
[00:28:02.840 --> 00:28:03.560] Thanks.
[00:28:03.560 --> 00:28:09.000] Those same age verification overlays are also only shown to UK users.
[00:28:09.000 --> 00:28:18.400] So if you fire up your copy of ZoomyZoom VPN, you can access exactly the same content without the age verification checks by just setting your country to Sweden or whatever.
[00:28:18.720 --> 00:28:21.920] Is this story also sponsored by Zoomizoom VPN?
[00:28:21.920 --> 00:28:27.760] It is sponsored by Zoomizum VPN in case you want to wank.
[00:28:28.320 --> 00:28:37.840] But we know from experience that every fucking podcast in the world is currently ramming NordVPN and Express VPN and Proton VPN and all that kind of stuff down your throat.
[00:28:37.840 --> 00:28:43.200] And explicitly advertising it is a way to bypass the geographic blocking of content.
[00:28:43.200 --> 00:28:50.960] Now they're saying it's to get around Netflix geo blocks and things like that, but it works exactly the same way for evading age checks on porn.
[00:28:50.960 --> 00:28:52.960] Are you allowed to make that point?
[00:28:52.960 --> 00:28:54.960] Because I haven't fully read the legislation.
[00:28:54.960 --> 00:28:56.800] I read some of the analysis of it here and there.
[00:28:56.800 --> 00:29:07.200] And part of the thing it was suggesting was it is no longer legal for websites that would be subject to the age verification to list information about how to use a VPN.
[00:29:07.520 --> 00:29:08.480] I don't know.
[00:29:08.480 --> 00:29:09.600] I've got no idea.
[00:29:09.600 --> 00:29:10.080] Okay.
[00:29:10.080 --> 00:29:22.240] But what I did do was I checked the Google Trends data for the UK, and there has been a four-fold increase in searches for the phrase VPN since these provisions came into force.
[00:29:22.480 --> 00:29:27.840] So just in the last week, there is four times as many searches for VPN as there were before that.
[00:29:28.000 --> 00:29:41.200] And that's not just people looking to access things that they shouldn't quote-unquote be accessing, but also people who just don't want to give the credit card details to companies or have their face scanned for companies.
[00:29:41.200 --> 00:29:45.200] So circumvention of these checks is trivial in many cases.
[00:29:45.200 --> 00:29:56.320] Research from Stanford University, which was published in pre-print in March, found that the introduction of age verification laws in some US states resulted in a 24% increase in searches for VPNs.
[00:29:56.320 --> 00:30:06.440] But more worryingly, they also observed a 48% increase in searches for porn sites that were known to not comply with age verification laws.
[00:30:07.400 --> 00:30:17.720] And if they're not complying with some laws, then what else are they not complying with in an industry that is rampant with problems and workers who aren't being adequately protected?
[00:30:17.720 --> 00:30:18.280] Absolutely.
[00:30:18.280 --> 00:30:25.560] So while sites like Pornhub and RedTube and whatever have got these age verifications in place, not all porn sites have.
[00:30:25.560 --> 00:30:28.120] Some of them are just ignoring the law.
[00:30:28.120 --> 00:30:38.360] So while the reputable porn sites, quote-unquote reputable porn sites, like Pornhub might be cooperating with the government on this one, the less reputable websites are not.
[00:30:38.360 --> 00:30:49.480] And the less reputable websites who are less concerned with following the law are where you're going to encounter exactly the kind of harmful and even illegal content that this law was supposed to prevent access to.
[00:30:49.480 --> 00:30:58.760] So the effect of these age verification laws can actually drive consumers, including children, toward harmful images and videos.
[00:31:00.040 --> 00:31:12.280] So we have then good empirical evidence which reveals a significant disconnect between the policy objectives of age verification and the practical outcomes of age verification.
[00:31:12.280 --> 00:31:15.000] And how this plays out within the UK remains to be seen.
[00:31:15.000 --> 00:31:29.880] We don't know how people are going to respond to this yet, but I'm skeptical that this will have the intended effect and may in fact have the opposite effect of driving curious children towards less regulated platforms that have fewer safeguards in place.
[00:31:30.840 --> 00:31:45.680] There's also significant demographic biases in the age verification of non-white faces, or for people, particularly young adults, who are more likely to fail face age verification, have no credit card, and have no government issued ID, and they just can't access this stuff now.
[00:31:46.080 --> 00:32:00.880] It's a major issue for people from all sorts of different backgrounds who can't get access to credit, you know, home on how to people, as you say, people on low income who might not be able to get people who've been in debt before who aren't allowed to get a credit card.
[00:32:01.280 --> 00:32:03.440] How many people does that penalize?
[00:32:03.440 --> 00:32:10.160] Yeah, literally right now, neither my parents possess photo ID, neither of them possess credit cards.
[00:32:10.400 --> 00:32:14.000] And even my mum's boyfriend doesn't have either of those things, I don't think.
[00:32:14.000 --> 00:32:16.240] Maybe he's got a photo driving license, maybe.
[00:32:16.240 --> 00:32:17.760] But certainly my mum wouldn't be able to do it.
[00:32:17.760 --> 00:32:18.640] My dad wouldn't be able to do it.
[00:32:18.720 --> 00:32:21.520] So, like, and they're in their 60s at this point.
[00:32:21.840 --> 00:32:24.160] And we haven't touched on this aspect yet.
[00:32:24.160 --> 00:32:27.680] When you prove your age, you need to do that with some sort of login.
[00:32:29.200 --> 00:32:30.640] You don't do it every time, right?
[00:32:30.640 --> 00:32:33.120] You do it once, and then it's saved against your account.
[00:32:33.120 --> 00:32:40.800] So now you need to sign up for Pornhub or Red Tube or whatever and actually log in to be able to access that content.
[00:32:40.800 --> 00:32:48.320] And while we have talked about the social stigma of having your name associated with porn, there is also the further risk of a data breach on these services.
[00:32:48.320 --> 00:32:54.880] And now your verified personal details and a list of videos you've been watching can be used to blackmail or extort you.
[00:32:54.880 --> 00:32:55.520] Yeah.
[00:32:56.480 --> 00:32:59.360] But we're still not done yet.
[00:32:59.360 --> 00:33:18.560] Because while most of this conversation is around the Online Safety Act is being directed around pornography, the Online Safety Act is not just about porn because there are other kinds of quote-unquote harmful content, which there are legitimate concerns around, like self-harm or pro-anna content or other eating disorder content and that sort of stuff.
[00:33:18.560 --> 00:33:25.520] But the way the law is phrased places a significant regulatory burden on web platforms.
[00:33:25.520 --> 00:33:31.160] So specifically, these provisions apply to what they refer to as user-to-user services.
[00:33:29.840 --> 00:33:36.040] So that's websites where users publish content and other users consume it.
[00:33:36.600 --> 00:33:41.720] So Twitter, Facebook, Discord, Reddit, forums, chat rooms, that sort of thing.
[00:33:42.360 --> 00:33:58.280] And the law requires that operators of those sites have moderation procedures in place to monitor and remove illegal content and also to prevent children from accessing harmful content, even if that content is legal for adults.
[00:33:58.600 --> 00:34:04.680] And that's probably fine when you're Reddit or Discord because you have moderators in place already.
[00:34:04.680 --> 00:34:10.680] But there are also smaller sites operated by individuals which fall into these categories.
[00:34:11.000 --> 00:34:18.280] So furry.energy was a small Mastodon server targeted at furries, particularly within the LGBTQ community.
[00:34:18.280 --> 00:34:29.720] The owners of that website shut it down in March because the volunteer owner was not confident that they had the resources to police the content to the extent required by law and couldn't afford the fines if they made a mistake.
[00:34:29.720 --> 00:34:35.960] The fines for this are up to 18 million pounds or 10% of your worldwide gross revenue.
[00:34:35.960 --> 00:34:37.080] Yeah, yeah, yeah.
[00:34:37.080 --> 00:34:40.120] So the fines are absolutely enormous.
[00:34:40.120 --> 00:34:44.840] Dads with Kids, which is a forum for single fathers, closed for the same reason.
[00:34:44.840 --> 00:34:47.880] Moderators say we can't afford the fines if we get this wrong.
[00:34:47.880 --> 00:34:52.440] Green Living, a forum about sustainable living, again, closed for the same reason.
[00:34:52.440 --> 00:34:54.920] We can't afford the fines if we get this wrong.
[00:34:54.920 --> 00:35:00.840] Genuinely, the question would be: there is a Discord associated with skeptics in the pub online.
[00:35:00.840 --> 00:35:01.240] Yeah.
[00:35:01.240 --> 00:35:02.120] Is that under risk?
[00:35:02.120 --> 00:35:02.840] Is that under threat?
[00:35:02.840 --> 00:35:04.920] Genuinely, well, we paused this short to it.
[00:35:04.920 --> 00:35:06.280] I don't know whether it would be.
[00:35:06.280 --> 00:35:11.240] It shouldn't be because Discord have moderators in place for that sort of stuff.
[00:35:11.240 --> 00:35:13.320] So you have to go to the big platforms now.
[00:35:13.320 --> 00:35:14.960] You can't just have old forums.
[00:35:14.960 --> 00:35:19.360] So there was a skeptics forum, not like UK Skeptics Forum for a while.
[00:35:14.760 --> 00:35:19.520] Yeah.
[00:35:19.760 --> 00:35:24.480] That would be sensible to close down because it will be massively under this legislation.
[00:35:24.720 --> 00:35:25.360] Yeah.
[00:35:25.680 --> 00:35:33.600] And these kind of forums and chat rooms are unlikely to have been hosting the kind of harmful and illegal content that this legislation was meant to address.
[00:35:33.600 --> 00:35:40.960] But the required risk assessments and the harsh penalties were enough to chill those users into closing down their websites.
[00:35:40.960 --> 00:35:46.160] And this disproportionately impacts niche groups like lesbian, gay, and trans groups.
[00:35:46.160 --> 00:35:58.400] And has the knock-on effect of further centralizing power with large online platforms who can afford moderators, but often can afford moderators because they're harvesting customers' personal information to sell them ads.
[00:35:58.400 --> 00:35:59.200] Yes, yeah, yeah.
[00:35:59.200 --> 00:36:08.480] And the reason it particularly harms niche groups is because just being gay is considered, could be easily be pulled into the consideration of that's mature content.
[00:36:08.480 --> 00:36:08.800] Absolutely.
[00:36:08.880 --> 00:36:13.280] The discussion of being gay counts in that case as something we need to protect children from.
[00:36:13.280 --> 00:36:14.320] Yes, it does.
[00:36:14.640 --> 00:36:22.160] So what we're doing here is further exacerbating one problem in the name of solving another problem and solving it ineffectually.
[00:36:22.160 --> 00:36:36.800] So while these systems may satisfy the political demands to be seen to be doing something about online child safety, the research actually consistently demonstrates that these mechanisms fail to meaningfully protect children.
[00:36:37.120 --> 00:36:46.240] Not only that, but we're actually seeing even amongst the large providers the age gating of all content that they deem mature, even if it is not harmful.
[00:36:46.240 --> 00:36:47.680] Speaking to your point, Marsh.
[00:36:47.680 --> 00:36:48.160] Yeah.
[00:36:48.160 --> 00:36:57.840] Reddit, for example, has put age verification demands on support forums for sexual assault survivors, for queer and trans discussion forums, for eating disorder support forums.
[00:36:57.840 --> 00:37:11.080] And so now teenagers are being excluded from those spaces by these age gates, despite teenagers being at greatest risk, for example, of developing an eating disorder and most in need of that kind of support.
[00:37:12.040 --> 00:37:19.640] Also caught in the middle of this is Wikipedia, because Wikipedia lets some users write content and other users read content.
[00:37:19.640 --> 00:37:40.680] So it technically qualifies as a user-to-user service under the terms of the Online Safety Act, meaning that Wikipedia may be required to prevent children from seeing quote-unquote harmful content, which could mean placing age verification walls on certain articles, requiring users to log in to be able to read Wikipedia, that sort of thing.
[00:37:41.000 --> 00:37:44.200] Wikipedia have proactively taken action to prevent this.
[00:37:44.200 --> 00:37:54.680] They've said we are not going to do any of these things, and they've brought this to judicial review to ensure that they are not classified in such a way that means they will be expected to put these moderation procedures in place.
[00:37:54.680 --> 00:38:01.720] Because when people turn to ChatGPT to answer search queries anyway, that kills Wikipedia in the UK.
[00:38:01.720 --> 00:38:05.240] People will just stop using it if you have to log in to find stuff on Wikipedia.
[00:38:05.240 --> 00:38:08.360] You'll just go to ChatGPT and ask a question instead.
[00:38:08.360 --> 00:38:09.800] And it's not just Wikipedia.
[00:38:09.800 --> 00:38:15.320] WhatsApp, Signal, iMessage, Telegram, Session, all these encrypted messaging services.
[00:38:15.320 --> 00:38:25.960] The Online Safety Act requires that providers of end-to-end encrypted communication provide the British government with a back door to access the encrypted data, just in case it's something harmful.
[00:38:25.960 --> 00:38:35.480] Just in case it's terrorism or child sex abuse or whatever, they have to have the means to access everyone's encrypted private communications just in case.
[00:38:35.480 --> 00:38:43.800] Yeah, it's essentially the digital version of the police are allowed to strip you naked on the street just in case you're concealing something that they need to be worried about.
[00:38:43.800 --> 00:38:47.360] When Theresa May's government tried to introduce this, it was referred to as the Snoopers Charter.
[00:38:47.360 --> 00:38:47.760] Yeah.
[00:38:47.760 --> 00:38:49.360] And that's very much how it's being used.
[00:38:49.360 --> 00:38:57.440] It's what's surprising to me is it's being used this way by a Labour government when the Tories actually didn't do anything with this, even after it had passed.
[00:38:57.440 --> 00:38:57.760] Yeah.
[00:38:57.760 --> 00:39:01.040] I wish it was surprising to me currently.
[00:39:01.040 --> 00:39:12.720] But this has caused a terrific amount of disquiet in the software community because those end-to-end encrypted systems are designed in such a way that even the operator of the system cannot access the content.
[00:39:12.720 --> 00:39:18.640] Apple have already withdrawn some of their newer privacy products from the UK because of this legislation.
[00:39:18.640 --> 00:39:22.320] Not specifically the Online Safety Act, but it's related to it.
[00:39:22.320 --> 00:39:32.880] So their advanced data protection service was withdrawn from the UK earlier this year after the UK government demanded that they modify it to allow law enforcement to access encrypted data.
[00:39:32.880 --> 00:39:39.840] Apple's argument, Apple's response to that, which is a valid argument, is there is no such thing as a backdoor for the good guys only.
[00:39:39.840 --> 00:39:40.480] Yes, yeah.
[00:39:40.480 --> 00:39:42.480] Once it exists, it can be exploited.
[00:39:42.480 --> 00:39:50.640] And how would the British government feel about the government of a hostile foreign power demanding access to the encrypted communications of Keir Starmer?
[00:39:50.640 --> 00:39:51.040] Yeah.
[00:39:51.040 --> 00:39:56.080] You know, because it would be exactly the same mechanisms that would be used for that.
[00:39:56.960 --> 00:40:08.800] Now, the government has promised that it will not try to enforce the provisions of the Online Safety Act that require companies to break encryption until it is technically feasible for them to do so.
[00:40:09.120 --> 00:40:11.280] But that is also a promise.
[00:40:11.280 --> 00:40:12.320] Yes, bullshit.
[00:40:12.320 --> 00:40:16.720] There's no safety mechanisms in the legislation that will hold the government to that.
[00:40:16.720 --> 00:40:20.080] They're just saying, we swear we won't do anything about this.
[00:40:20.080 --> 00:40:25.520] And that's just a policy change away from them starting to sue people for using encrypted chats.
[00:40:25.520 --> 00:40:29.600] Yeah, especially when what they're asking for is almost definitionally not possible.
[00:40:29.600 --> 00:40:29.960] Yes.
[00:40:29.960 --> 00:40:30.520] So yeah, we will.
[00:40:30.600 --> 00:40:33.320] Because it's not end-to-end encrypted anymore if there's a backdoor into it.
[00:40:29.680 --> 00:40:36.760] Yeah, we will not enforce this policy until unicorns are real.
[00:40:37.000 --> 00:40:37.480] Yes.
[00:40:37.480 --> 00:40:39.320] You're not going to stick to that.
[00:40:39.320 --> 00:40:53.160] Various online providers have said that they would sooner withdraw from the UK market rather than compromise their own products, which means that these requirements, if enforced, would effectively outlaw encrypted communication in the UK.
[00:40:53.480 --> 00:40:54.760] Which is a lot, right?
[00:40:54.760 --> 00:40:55.320] Yeah.
[00:40:55.640 --> 00:40:59.720] So the last thing I want to talk about is the problem itself.
[00:41:00.040 --> 00:41:07.800] It is true that we are facing a generation of children who have instant access to the most extreme forms of pornography.
[00:41:07.800 --> 00:41:11.160] When I was a kid, porn was something you found in a bush in the park.
[00:41:11.160 --> 00:41:11.480] Yes.
[00:41:11.480 --> 00:41:11.960] Right?
[00:41:11.960 --> 00:41:16.200] And it amounted to a lady with big boobs smiling while looking dead behind the eyes.
[00:41:16.600 --> 00:41:18.840] That was what porn in the 90s was.
[00:41:18.840 --> 00:41:21.000] And it's the only thing you can get off to now.
[00:41:22.200 --> 00:41:24.920] It just had a deep effect on the only thing that works.
[00:41:25.240 --> 00:41:34.360] But even when I was in my late teens and able to access internet pornography for the first time, internet pornography was a very different beast in late 90s, early 2000s.
[00:41:34.360 --> 00:41:39.480] Content featuring anal sex was like the hardcore stuff, was the scary stuff.
[00:41:39.480 --> 00:41:42.920] And now anal is treated as like a given as table stakes.
[00:41:42.920 --> 00:41:47.000] Well, obviously you do anal, but what do you do for the really hard stuff?
[00:41:47.640 --> 00:41:59.000] There has also been what you might refer to as the BDSMification of porn, where porn that would have been considered niche BDSM content is now considered mainstream content.
[00:41:59.000 --> 00:42:19.440] So having faces slapped and hair-pulled, and degradation, and name-calling, which was once the preserve of the FET community and enacted within a framework of negotiation and understanding and consent and safeguards and give and take, is now presented without any of that framework, at least in how it's depicted on screen, as if this is what sex is normally like.
[00:42:20.640 --> 00:42:30.000] And we are seeing this translating to how teenagers, especially young women, are being treated by their partners during their first tentative sexual experiences.
[00:42:30.000 --> 00:42:32.400] This is not an imaginary problem.
[00:42:32.400 --> 00:42:40.240] The immediate access to porn seems to be the proximate cause of this change in teenagers' shift in attitudes.
[00:42:40.640 --> 00:42:42.320] Certainly one of the proximate causes.
[00:42:42.320 --> 00:42:43.040] Yes.
[00:42:43.360 --> 00:42:50.720] But I don't think that gating adult content in the way that the Online Safety Act tries to is going to be an effective counter to this.
[00:42:51.040 --> 00:42:55.360] What I think would be an effective counter to this is improved sex education.
[00:42:56.320 --> 00:43:00.800] Kids aren't watching RoboCop and thinking that's how policing works, right?
[00:43:00.800 --> 00:43:03.920] They understand the difference between fantasy and reality.
[00:43:03.920 --> 00:43:13.360] And what they need to understand is that porn represents a fantasy world that is not reflective of real sex or typical sexual relationships.
[00:43:13.680 --> 00:43:25.840] But ironically, the pearl clutchers who have pushed for this kind of legislation over the last decade or so are also the same ones who argue against comprehensive sex education about sex and sexuality in schools.
[00:43:25.840 --> 00:43:26.240] Yeah, absolutely.
[00:43:26.400 --> 00:43:28.080] Because you can't teach kids about that.
[00:43:28.320 --> 00:43:29.680] They're not ready for it.
[00:43:29.680 --> 00:43:36.880] Which means curious kids will go out and seek the information for themselves and find what they think are the answers in porn.
[00:43:37.920 --> 00:43:39.360] And there's data to support this.
[00:43:39.360 --> 00:43:44.560] So one pilot study from 2020 ran a porn literacy program with teenagers.
[00:43:44.560 --> 00:43:49.600] They taught those teenagers to spot the scripted and unrealistic stuff in porn.
[00:43:49.600 --> 00:43:53.360] They taught them to understand that real relationships don't look like this.
[00:43:53.360 --> 00:44:01.080] And they did that while framing it in a non-judgmental way and without saying you should be discouraged from watching porn.
[00:44:02.200 --> 00:44:14.280] And after taking part in that program, the teens were less likely to buy into harmful stereotypes that porn promotes and they were less likely to buy into the idea that certain behaviors are normal or expected.
[00:44:14.280 --> 00:44:28.600] So from the paper, quote, participants learned that not everyone enjoys being called names like slut during sex and they need to ask partners for consent before each new sexual act that they may want to try during a sexual encounter.
[00:44:28.600 --> 00:44:40.040] For example, anal sex, hair pulling, and spanking would each require separate consent and may not be as widely enjoyed by women as they might presume after having watched mainstream pornography.
[00:44:40.040 --> 00:44:41.320] Unquote.
[00:44:41.960 --> 00:44:56.040] What this paper found was that before the workshop, 44% of those attending said that watching porn made them want to emulate what they saw, and 26% of those attending said they thought porn was a realistic depiction of sex.
[00:44:56.360 --> 00:45:04.440] After the workshop, those numbers reduced to 29% who wanted to emulate what they had seen, down from 44%.
[00:45:04.440 --> 00:45:09.640] And the number who thought that porn was a realistic depiction of sex dropped to zero.
[00:45:09.960 --> 00:45:15.800] So after the workshop, none of them agreed was a realistic depiction of sex.
[00:45:16.440 --> 00:45:18.840] They also looked at adverse events.
[00:45:18.840 --> 00:45:31.640] So one of the adverse events they looked at was whether the teenagers sought to access more porn after the workshop, because that's a common objection to these kind of workshops: telling teenagers about porn will make them want to go and seek it out.
[00:45:31.640 --> 00:45:37.480] But in follow-up interviews, they found that the number who reported using porn did not change before and after the session.
[00:45:37.640 --> 00:45:38.520] Of course, it wouldn't change.
[00:45:38.520 --> 00:45:40.360] Teenagers want to see porn.
[00:45:40.600 --> 00:45:42.520] It doesn't matter whether you give them access or not.
[00:45:42.520 --> 00:45:42.920] They want to.
[00:45:42.920 --> 00:45:45.000] That's the whole fucking problem here, isn't it?
[00:45:45.520 --> 00:45:56.000] Nor did any of those involved report feeling anxious or upset by the content of the workshop, which is the other thing that, you know, maybe if you're a sexual abuse survivor or something like that, that might be a difficult thing for you to do.
[00:45:56.000 --> 00:45:59.520] None of those adverse events were reported in this paper.
[00:45:59.520 --> 00:46:04.800] Now, this paper has got very small numbers, 31 participants, no control group.
[00:46:04.800 --> 00:46:06.800] The follow-up was quite short-term.
[00:46:06.800 --> 00:46:09.040] Good reasons to be skeptical of these figures.
[00:46:09.040 --> 00:46:16.880] But my understanding, researching around this topic for the show, is that these findings are not untypical in this sort of research.
[00:46:16.880 --> 00:46:23.680] Many, many other papers find very similar things with porn literacy programs, is that they have these sorts of effects.
[00:46:24.320 --> 00:46:34.320] So while the Online Safety Act attempts to tackle the problem of children accessing harmful content, the tools and methods it relies on are flawed and prone to failure.
[00:46:34.320 --> 00:46:40.560] Age verification technology struggles with accuracy and with bias and can be easily circumvented.
[00:46:40.560 --> 00:46:52.960] The legislation's impact on smaller and often vulnerable communities and platforms risks further centralization of power in the hands of a few large companies, often at the cost of privacy and accessibility.
[00:46:52.960 --> 00:47:04.160] And more importantly, the focus on blocking access misses the bigger picture, which is children need better education to understand what they are seeing and to navigate their own sexuality safely.
[00:47:04.160 --> 00:47:14.880] The evidence, limited though it is, suggests that porn literacy programs reduce harmful misconceptions without causing the unintended harms of censorship and surveillance.
[00:47:15.520 --> 00:47:19.920] So in the end, I think the Online Safety Act is tackling the wrong problem.
[00:47:19.920 --> 00:47:29.760] The question isn't how to keep teenagers away from porn, it's how to prepare them to engage with the real world where sex and relationships are more complex than what they're going to see on the screen.
[00:47:30.120 --> 00:47:35.160] And any law that tries to simply slam the door shut is going to fall short.
[00:47:39.320 --> 00:47:40.920] I went to see Superman.
[00:47:40.920 --> 00:47:41.720] Oh, right, yes.
[00:47:41.720 --> 00:47:43.160] I went to see the new Superman film.
[00:47:43.160 --> 00:47:47.080] I've long since given up on DC films because they've all been shit for a long time.
[00:47:47.080 --> 00:47:50.040] They have, and I don't particularly like Superman as an idea.
[00:47:50.040 --> 00:47:51.160] I don't find it interesting.
[00:47:51.160 --> 00:47:53.560] He's just a man who does almost everything and is in Rumble.
[00:47:53.720 --> 00:47:54.040] Yeah.
[00:47:54.280 --> 00:48:04.600] And there's been a problem with Superman, and this problem has been since the Christopher Reed version of Superman, is that Superman is absurdly overpowered to the point where you've got two plots in Superman, right?
[00:48:04.600 --> 00:48:07.400] You've got Kryptonite and Bad Superman.
[00:48:07.400 --> 00:48:07.720] Yep.
[00:48:07.720 --> 00:48:09.720] And those are the only two plots that you've got.
[00:48:09.720 --> 00:48:11.240] You've got Kryptonite and Bad Superman.
[00:48:11.240 --> 00:48:13.960] So in the Chris Reeve film, Superman 1, Kryptonite.
[00:48:13.960 --> 00:48:15.720] Superman 2, Bad Superman.
[00:48:15.960 --> 00:48:17.720] Superman 3, Bad Superman.
[00:48:17.720 --> 00:48:19.560] Superman 4, Bad Superman.
[00:48:19.560 --> 00:48:20.680] That's the only plot you've got.
[00:48:21.480 --> 00:48:25.320] The Brandon Routh one with Kevin Spacey in it, that was Kryptonite.
[00:48:25.320 --> 00:48:25.800] Don't remember.
[00:48:26.680 --> 00:48:30.440] It's the Man of Steel, the Henry Capel one, Bad Superman.
[00:48:30.440 --> 00:48:30.760] Didn't watch it.
[00:48:31.160 --> 00:48:31.480] That's it.
[00:48:31.480 --> 00:48:32.200] You've got two plots.
[00:48:34.200 --> 00:48:35.320] Which one is this?
[00:48:35.320 --> 00:48:39.480] It turns out, I don't want to put spoilers in for Superman, but it turns out it's both.
[00:48:42.200 --> 00:48:46.200] So, Superman's a very overpowered character, and that's a problem.
[00:48:46.200 --> 00:48:52.120] And they recognized this in the comics, and they did a long-running comics plotline, which I think is Crisis on Infinite Earth.
[00:48:52.280 --> 00:48:59.800] I might be wrong, I'm not a DC comics guy, where they significantly nerfed Superman because they recognized he was too powerful and it made for boring plots.
[00:49:01.320 --> 00:49:05.400] This happened after the Chris Reeve film had been done.
[00:49:05.400 --> 00:49:14.960] And so, movie Superman is still the ridiculously overpowered Superman from the 70s, even though the comics have kind of backed him off because they realized it made for shit stories.
[00:49:14.440 --> 00:49:16.320] And that's carried through.
[00:49:16.400 --> 00:49:24.720] It carried through all the way through to the Henry Cavill Superman, who again is ridiculously overpowered, probably even more so than the Christopher Reeve one.
[00:49:24.720 --> 00:49:27.360] But for the new one, they've really backed it off.
[00:49:27.360 --> 00:49:27.920] Okay.
[00:49:27.920 --> 00:49:30.320] So he's much more vulnerable character.
[00:49:30.320 --> 00:49:34.640] The world is full of other superhero characters who give him a serious run for his money.
[00:49:34.640 --> 00:49:37.040] And it's a much more fun Superman as well.
[00:49:37.040 --> 00:49:43.920] Superman's always portrayed as very po-faced and very, you know, it's all very serious and it's very serious business being Superman.
[00:49:43.920 --> 00:49:45.440] And this is a fun Superman.
[00:49:45.440 --> 00:49:50.640] So after having not watched any of the DC, and I wasn't planning to watch this one until the reviews actually said this was quite good and crazy.
[00:49:50.800 --> 00:49:51.920] I saw the reviews thought it was decent.
[00:49:52.400 --> 00:49:56.080] So I went out and watched Nicholas Holt is really good as Lex Luther.
[00:49:56.080 --> 00:49:57.120] He's good in lots of things.
[00:49:57.120 --> 00:49:58.080] He's fantastic in it.
[00:49:58.080 --> 00:50:01.600] He was beast in the young cast of X-Men.
[00:50:01.600 --> 00:50:03.280] Yeah, and he was in skins, wasn't he?
[00:50:03.280 --> 00:50:03.600] Wasn't he?
[00:50:04.400 --> 00:50:06.880] I thought he could start in skins, I think.
[00:50:06.880 --> 00:50:08.480] But yeah, I really enjoyed Superman.
[00:50:08.480 --> 00:50:09.040] It was good.
[00:50:09.040 --> 00:50:14.960] And it was silly and fun in a way that Superman just hasn't been in the movies for a long time.
[00:50:14.960 --> 00:50:16.560] So I definitely recommend it.
[00:50:17.120 --> 00:50:18.000] It was worth going.
[00:50:18.000 --> 00:50:18.800] I went with Emma.
[00:50:18.800 --> 00:50:20.320] We went to see it at The Fact.
[00:50:20.480 --> 00:50:22.480] And there was no one else went to see it at The Fact.
[00:50:22.640 --> 00:50:23.760] That's often the case with facts.
[00:50:23.760 --> 00:50:24.240] I like facts.
[00:50:24.880 --> 00:50:30.960] When Emma booked the tickets, there was just one person that had already booked a seat and they booked a kind of middle-middle seat.
[00:50:31.600 --> 00:50:35.200] And so we did toy with the idea of booking the seat either side of them.
[00:50:35.360 --> 00:50:36.160] Toward it was good.
[00:50:37.520 --> 00:50:38.400] She fucked them off.
[00:50:38.560 --> 00:50:39.920] They wouldn't let you do that in one booking.
[00:50:39.920 --> 00:50:41.840] I think you'd have to do two bookings for that.
[00:50:42.560 --> 00:50:44.000] Don't really fuck this person off.
[00:50:44.400 --> 00:50:45.600] It'll be a laugh.
[00:50:45.600 --> 00:50:51.600] But for some reason, almost every advert, and I think this is a fact thing rather than a Superman thing.
[00:50:51.600 --> 00:50:53.760] So many adverts were for cars.
[00:50:53.800 --> 00:50:55.920] Yeah, well, they're sponsored by Kia, aren't they?
[00:50:56.080 --> 00:50:57.680] Pidger House is sponsored by Kia.
[00:50:57.680 --> 00:51:01.400] So there was the Kia one that was very close to the film, right?
[00:50:59.920 --> 00:51:07.080] So you had the generic ads, then trailers, then the Kia ad, and then the films.
[00:51:07.240 --> 00:51:08.280] Yes, yeah, yeah, that's what we have.
[00:51:09.160 --> 00:51:11.640] A lot of the generic ads were fucking cars as well.
[00:51:11.640 --> 00:51:24.760] It's like, well, when it used to be for films, it was all cars and perfume, and just maybe cars of car companies are the only ones with the money left for to be able to do cinema-based advertising these days, you know.
[00:51:24.760 --> 00:51:26.360] But yeah, it was so, but it was fun.
[00:51:26.360 --> 00:51:26.840] I enjoyed it.
[00:51:26.840 --> 00:51:29.960] It was, I would definitely recommend going to see going to see Superman.
[00:51:29.960 --> 00:51:33.320] Not as good as 28 years later, which I think might be my favorite film.
[00:51:33.320 --> 00:51:34.280] I've not seen it.
[00:51:35.080 --> 00:51:36.440] I could not get Nicol to watch it.
[00:51:36.440 --> 00:51:38.840] No, she's not interested at all.
[00:51:39.160 --> 00:51:41.400] Yeah, I will watch it at some point.
[00:51:41.400 --> 00:51:42.040] It's a good film.
[00:51:42.360 --> 00:51:43.000] We'll watch it with you.
[00:51:43.000 --> 00:51:43.960] I've not been to cinema.
[00:51:43.960 --> 00:51:47.800] I did watch the women's Euros, which has been very, very exciting.
[00:51:47.800 --> 00:51:48.680] Yes, I saw that.
[00:51:48.680 --> 00:51:50.040] Was lots of people talking about that.
[00:51:50.040 --> 00:51:51.320] Lots of people at work excited about that.
[00:51:51.480 --> 00:51:52.280] Yeah, it was excellent.
[00:51:52.280 --> 00:51:53.880] So England has won again.
[00:51:53.880 --> 00:51:55.160] So they've won twice.
[00:51:55.160 --> 00:51:56.840] They won it back-to-back, basically.
[00:51:56.840 --> 00:51:57.400] 2022.
[00:51:57.560 --> 00:51:58.440] Which does not happen.
[00:51:58.440 --> 00:52:01.880] Yeah, which should have been the 2021 tournament, but it was delayed by.
[00:52:01.880 --> 00:52:05.880] And then three years later, they won it this year, which was very good.
[00:52:05.880 --> 00:52:06.840] It was very exciting.
[00:52:06.840 --> 00:52:13.240] Lots of extra time, lots of kind of like tense kind of games, lots of going behind the quarterfinals and the semifinals.
[00:52:13.240 --> 00:52:20.040] Across the quarters, the semis, and the final, England were in the lead, cumulatively, for four and a half minutes.
[00:52:20.040 --> 00:52:20.520] Okay.
[00:52:20.520 --> 00:52:23.160] They played 360 minutes in those three games.
[00:52:23.160 --> 00:52:25.400] Cumulatively, they were ahead for four and a half minutes.
[00:52:25.640 --> 00:52:28.280] That's how kind of exciting and mad the whole thing was.
[00:52:28.280 --> 00:52:31.080] But the most, the maddest thing about it, I actually love the team.
[00:52:31.080 --> 00:52:32.680] I absolutely love so many of the players in it.
[00:52:32.680 --> 00:52:38.200] Lucy Bronze, the right back, is ridiculously, impressively nails.
[00:52:38.200 --> 00:52:41.320] She's hard as nails, like insanely sore, ridiculously sore.
[00:52:41.560 --> 00:52:45.520] Right down to the point where Tough is very literally her middle name.
[00:52:44.680 --> 00:52:48.800] She was born Lucia Roberta Tough Bronze.
[00:52:50.320 --> 00:52:54.960] But she went off with an injury in the second half and halftime in extra time.
[00:52:54.960 --> 00:53:03.200] So she played the 90 minutes and then 15 minutes and she kept going down with an injury to a leg and she's literally limping around while the ball is out of play.
[00:53:03.200 --> 00:53:04.880] And then when the ball's in place, she's running again.
[00:53:04.880 --> 00:53:05.920] You're like, what are you doing?
[00:53:05.920 --> 00:53:06.400] Just go off.
[00:53:06.400 --> 00:53:07.440] This is crazy.
[00:53:07.440 --> 00:53:13.440] And then in the interview at the end, they said to her, oh, and you had that injury to your right leg.
[00:53:13.440 --> 00:53:20.800] And you were, she's during the celebrations and everything, she's fully leg strapped up and she's hopping on her left leg because she can't bend her right leg anymore.
[00:53:20.800 --> 00:53:22.000] It's completely nagged.
[00:53:22.000 --> 00:53:24.640] And she said, they're saying, oh, so you're how's the how's the leg?
[00:53:24.640 --> 00:53:25.200] Is it all right?
[00:53:25.280 --> 00:53:27.520] She says, yeah, it's not, it's, it's bad.
[00:53:27.520 --> 00:53:33.120] But what I didn't say was throughout this entire tournament, I've been playing with a broken fibia in the other leg.
[00:53:33.120 --> 00:53:34.240] In the other leg?
[00:53:34.240 --> 00:53:36.560] So she's tibia, sorry.
[00:53:36.560 --> 00:53:40.800] So she's played the entirety of the tournament and won the tournament with a fractured tibia.
[00:53:40.800 --> 00:53:41.280] Fucking hell.
[00:53:42.160 --> 00:53:44.240] And they were like, wow, that must have been painful.
[00:53:44.640 --> 00:53:47.920] Yeah, it was incredibly painful constantly every single second of the game.
[00:53:48.320 --> 00:53:51.840] So she's now done a knee in the uninjured leg.
[00:53:51.840 --> 00:53:56.000] Presumably because she's been carrying a weight funny.
[00:53:56.000 --> 00:53:58.240] And when they said that, she was like, she laughed.
[00:53:58.240 --> 00:54:00.480] She was like, yeah, it's been insanely painful.
[00:54:00.720 --> 00:54:01.760] What is wrong with you?
[00:54:01.760 --> 00:54:02.880] You are not human.
[00:54:02.880 --> 00:54:03.280] Yeah.
[00:54:03.280 --> 00:54:04.640] Incredible.
[00:54:08.160 --> 00:54:15.920] So for QED, we've got the announcement of two new podcasts that are going to be appearing in the QED podcast room at QED.
[00:54:15.920 --> 00:54:19.360] The first of those is, of course, going to be Incredulous.
[00:54:19.360 --> 00:54:23.440] Incredulous has been on the QED main stage for the whole run of QED.
[00:54:23.440 --> 00:54:26.640] It has every year of QED so far, right up until the last one.
[00:54:27.600 --> 00:54:28.760] We're going to have an Incredulous.
[00:54:28.800 --> 00:54:30.280] Yep, which should be very, very fun.
[00:54:30.280 --> 00:54:31.160] It's always a good time.
[00:54:31.160 --> 00:54:32.120] It's always fun.
[00:54:32.120 --> 00:54:33.240] It's going to be a fantastic time.
[00:54:33.240 --> 00:54:34.360] We're looking forward to that.
[00:54:29.840 --> 00:54:36.520] And that, of course, is going to be hosted by Andy Wilson.
[00:54:36.760 --> 00:54:39.160] So you should definitely come along and watch that.
[00:54:39.160 --> 00:54:42.520] The other podcast that we've got is the No Rogan Experience.
[00:54:42.520 --> 00:54:47.560] Yes, so Cecil and I are going to do a live show of the No Rogan experience at QED.
[00:54:47.560 --> 00:54:50.680] So that is, for listeners who aren't aware, I don't mention it too often.
[00:54:50.680 --> 00:54:51.800] I did when I first launched it.
[00:54:51.800 --> 00:55:13.640] I do a show with Cecil Cicarello
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Prompt 5: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 2 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
inful constantly every single second of the game.
[00:53:48.320 --> 00:53:51.840] So she's now done a knee in the uninjured leg.
[00:53:51.840 --> 00:53:56.000] Presumably because she's been carrying a weight funny.
[00:53:56.000 --> 00:53:58.240] And when they said that, she was like, she laughed.
[00:53:58.240 --> 00:54:00.480] She was like, yeah, it's been insanely painful.
[00:54:00.720 --> 00:54:01.760] What is wrong with you?
[00:54:01.760 --> 00:54:02.880] You are not human.
[00:54:02.880 --> 00:54:03.280] Yeah.
[00:54:03.280 --> 00:54:04.640] Incredible.
[00:54:08.160 --> 00:54:15.920] So for QED, we've got the announcement of two new podcasts that are going to be appearing in the QED podcast room at QED.
[00:54:15.920 --> 00:54:19.360] The first of those is, of course, going to be Incredulous.
[00:54:19.360 --> 00:54:23.440] Incredulous has been on the QED main stage for the whole run of QED.
[00:54:23.440 --> 00:54:26.640] It has every year of QED so far, right up until the last one.
[00:54:27.600 --> 00:54:28.760] We're going to have an Incredulous.
[00:54:28.800 --> 00:54:30.280] Yep, which should be very, very fun.
[00:54:30.280 --> 00:54:31.160] It's always a good time.
[00:54:31.160 --> 00:54:32.120] It's always fun.
[00:54:32.120 --> 00:54:33.240] It's going to be a fantastic time.
[00:54:33.240 --> 00:54:34.360] We're looking forward to that.
[00:54:29.840 --> 00:54:36.520] And that, of course, is going to be hosted by Andy Wilson.
[00:54:36.760 --> 00:54:39.160] So you should definitely come along and watch that.
[00:54:39.160 --> 00:54:42.520] The other podcast that we've got is the No Rogan Experience.
[00:54:42.520 --> 00:54:47.560] Yes, so Cecil and I are going to do a live show of the No Rogan experience at QED.
[00:54:47.560 --> 00:54:50.680] So that is, for listeners who aren't aware, I don't mention it too often.
[00:54:50.680 --> 00:54:51.800] I did when I first launched it.
[00:54:51.800 --> 00:55:13.640] I do a show with Cecil Cicarello from Citation Needed and Cognitive Dissonance, where we watch episodes of Joe Rogan and explain where the errors in thinking are, where the conspiracy is being pushed, whether there's facts that are incorrect, and essentially what is happening on the biggest podcast in the world that's influencing huge, huge numbers of people.
[00:55:13.640 --> 00:55:14.600] What's actually going on?
[00:55:14.600 --> 00:55:20.760] And what should we know about it as people who are trying to promote a better way of addressing the world and thinking about stuff?
[00:55:20.760 --> 00:55:21.800] So we're going to do a live show.
[00:55:22.040 --> 00:55:35.720] We're thinking it's going to be a slightly different structure to a normal episode, but I think we can do something with the 45 to 50 minute space in the room that should be interesting and give a good feel for what we're doing and should be a good live experience.
[00:55:35.720 --> 00:55:39.240] So yeah, we're excited to do our first live show at QED.
[00:55:39.240 --> 00:55:40.600] That's very exciting.
[00:55:40.600 --> 00:55:58.680] So if you would like to watch the live podcast stream at QED, you can do so by going to QDCon.org and purchase an online ticket that is £49 and gets you access to the main stage content, the panel room content, and the live podcasts across the entire weekend.
[00:55:59.320 --> 00:56:01.920] Including this show live, including a live skeptics with a k included.
[00:56:02.040 --> 00:56:04.040] It's going to be a live version of this show as well.
[00:56:04.040 --> 00:56:07.800] And you can find more information about QED at QEDcon.org.
[00:56:08.120 --> 00:56:10.040] I should also plug our Patreon.
[00:56:10.040 --> 00:56:18.480] So, if you enjoy the show, if you enjoy what we do and you would like to support us, you can do that by going to patreon.com forward slash skeptics with a K, where you can donate for as little as a pound a month.
[00:56:18.800 --> 00:56:27.680] That money helps us, it helps with the running costs of the show, and also it gives you access to an ad-free version of this podcast.
[00:56:28.000 --> 00:56:30.640] Similarly, Merseyside Skeptics also have a Patreon.
[00:56:30.800 --> 00:56:33.360] You can access that at patrion.com forward slash Merseyskeptics.
[00:56:33.360 --> 00:56:40.560] And if you like what we're doing and you want to support the Merseyside Skeptics, you can do that by going to their Patreon, where you will also get an ad-free version of this show.
[00:56:40.560 --> 00:56:41.360] Yep.
[00:56:41.360 --> 00:56:43.840] Aside from that, then, I think that is all we have time for.
[00:56:43.840 --> 00:56:44.560] I think it is.
[00:56:44.560 --> 00:56:47.280] All that remains then is for me to thank Marsh for coming along today.
[00:56:47.280 --> 00:56:47.840] Cheers.
[00:56:47.840 --> 00:56:48.800] Thank you to Alice.
[00:56:48.800 --> 00:56:49.280] Thank you.
[00:56:49.280 --> 00:56:51.840] We have been Skeptics with a K, and we will see you next time.
[00:56:51.840 --> 00:56:52.560] Bye now.
[00:56:52.560 --> 00:56:53.600] Bye.
[00:56:58.400 --> 00:57:03.440] Skeptics with a K is produced by Skeptic Media in association with the Merseyside Skeptic Society.
[00:57:03.440 --> 00:57:12.720] For questions or comments, email podcast at skepticswithakay.org, and you can find out more about Merseyside Skeptics at mercyside skeptics.org.uk.
Prompt 6: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 7: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Full Transcript
[00:00:00.240 --> 00:00:05.680] I'm no tech genius, but I knew if I wanted my business to crush it, I needed a website now.
[00:00:05.680 --> 00:00:07.680] Thankfully, Bluehost made it easy.
[00:00:07.680 --> 00:00:12.400] I customized, optimized, and monetized everything exactly how I wanted with AI.
[00:00:12.400 --> 00:00:14.160] In minutes, my site was up.
[00:00:14.160 --> 00:00:15.200] I couldn't believe it.
[00:00:15.200 --> 00:00:18.240] The search engine tools even helped me get more site visitors.
[00:00:18.240 --> 00:00:21.680] Whatever your passion project is, you can set it up with Bluehost.
[00:00:21.680 --> 00:00:24.720] With their 30-day money-back guarantee, what do you got to lose?
[00:00:24.720 --> 00:00:26.400] Head to bluehost.com.
[00:00:26.400 --> 00:00:30.880] That's B-L-U-E-H-O-S-T.com to start now.
[00:00:37.600 --> 00:00:45.760] It is Thursday, the 31st of July, 2025, and you're listening to Skeptics with a K, the podcast for science, reason, and critical thinking.
[00:00:45.760 --> 00:00:56.960] Skeptics with a K is produced by Skeptic Media in association with the Merseyside Skeptic Society, a non-profit organization for the promotion of scientific skepticism on Merseyside around the UK and internationally.
[00:00:56.960 --> 00:00:58.320] I'm your host, Mike Hall.
[00:00:58.320 --> 00:00:59.520] With me today is Marsh.
[00:00:59.520 --> 00:01:00.080] Hello.
[00:01:00.080 --> 00:01:00.960] And Alice.
[00:01:00.960 --> 00:01:01.760] Hello.
[00:01:02.080 --> 00:01:07.920] Well, it's actually the time of the year where we ask people for their nominations for our Occam Awards.
[00:01:07.920 --> 00:01:08.960] The OCAMS.
[00:01:08.960 --> 00:01:09.840] The Occam Awards.
[00:01:09.840 --> 00:01:12.400] So the Occam Awards, these have been going for a long time.
[00:01:12.560 --> 00:01:17.600] This is before we even took over the Skeptic Magazine as editors and publisher of the magazine.
[00:01:17.600 --> 00:01:22.160] They're there to celebrate the very best in skeptical work over the last year.
[00:01:22.160 --> 00:01:34.080] So we ask for nominations as to who you think has done some excellent work, where have they had a wide reach, where they had an impact, maybe they've even changed something for the better, actually had a tangible impact in the world.
[00:01:34.080 --> 00:01:35.680] Who's doing some excellent work?
[00:01:35.680 --> 00:01:38.640] Or even who's doing work in the face of some serious adversity?
[00:01:38.640 --> 00:01:43.760] You know, who's actually coming up against some serious opposition from the pseudo-scientists and things out there.
[00:01:43.760 --> 00:01:46.480] So we always give out essentially the skeptic of the year award.
[00:01:46.480 --> 00:01:50.080] It's not quite called that, but that is essentially thematically what it's all about.
[00:01:50.080 --> 00:01:56.640] And then the other thing we will be giving out will be the Rusty Razor Award, which is the exact opposite of that.
[00:01:56.640 --> 00:01:59.880] So, who out there has been peddling pseudoscience?
[00:01:59.600 --> 00:02:02.840] What organization has been pushing misinformation?
[00:02:03.160 --> 00:02:12.760] What product is completely nonsense and woo in a way that we ask people to think about what harm has actually been done and how far has it actually got and who how many people are believing in?
[00:02:12.760 --> 00:02:15.000] Is this making the world appreciably worse?
[00:02:15.000 --> 00:02:24.120] Essentially, and if you go to the website of the skeptic magazine, so skeptic.org.uk, you can go there and you can find the nomination form for the Occams and the Rusty Razor.
[00:02:24.120 --> 00:02:28.520] And you can tell us who deserves either one or both, and you can tell us your justification.
[00:02:28.680 --> 00:02:43.240] We will look through the nominations, and as an editorial board, we'll decide who is the winner, and then we'll announce that winner of the skeptic of the year and of the rusty razor award at this year's QED on the Saturday night before the Gala Dinner, before the Gala Dinner, before the comedy at QED.
[00:02:43.240 --> 00:02:45.400] So, we announce it there on stage every year.
[00:02:45.400 --> 00:02:48.680] So, we want lots and lots of people to send us their nominations.
[00:02:52.520 --> 00:02:54.200] Porn porn, porny porn.
[00:02:54.200 --> 00:02:55.640] We're going to talk a lot about porn.
[00:02:55.640 --> 00:03:03.880] So, if you don't want to hear about porn or you're listening with someone who you don't want to hear about porn, then stop listening before we start talking about the porn, porn, porn, porn, porn.
[00:03:03.880 --> 00:03:10.200] Yeah, what if you're with somebody who you don't even want to know that porn as a concept exists?
[00:03:10.200 --> 00:03:12.760] Too late now, yeah, very much so.
[00:03:12.760 --> 00:03:17.080] That cat's out the back because it's not like you said it once, no, and it could have slipped by.
[00:03:17.080 --> 00:03:20.200] You very much made it front and center of every part of the conversation, yeah.
[00:03:20.200 --> 00:03:25.640] And there are much more succinct ways I could have said that, I imagine, but you know, fuck it, my show in it.
[00:03:25.640 --> 00:03:37.000] So, in the last few days, some key provisions of the Online Safety Act have come into effect following many, many years of delays and implementation difficulties.
[00:03:37.000 --> 00:03:53.280] The Act creates a duty of care for online service providers to not only take action against illegal content, which is commonly misunderstood to refer to sexual abuse material, although it's other stuff as well, but mostly that's where people's attention is focused.
[00:03:53.280 --> 00:04:00.720] But also legal content, which nevertheless may be harmful for children, which is commonly understood to mean porn.
[00:04:00.720 --> 00:04:01.520] Yes, yeah.
[00:04:01.840 --> 00:04:06.800] Now, we first talked about this way back in episode 113.
[00:04:06.800 --> 00:04:07.440] Did we?
[00:04:07.440 --> 00:04:16.800] When the David Cameron government was trying to implement very similar legislation at that point, where this was known as the David Cameron porn filter.
[00:04:16.800 --> 00:04:17.360] Yeah.
[00:04:17.600 --> 00:04:26.080] And our joke at the time was this is a filter introduced by David Cameron to remove porn, not a filter to remove David Cameron porn.
[00:04:26.400 --> 00:04:29.200] Although you should definitely be removing David Cameron.
[00:04:29.280 --> 00:04:31.680] It would actually function as porn, essentially, yes.
[00:04:32.000 --> 00:04:46.880] It was widely criticized at the time, the David Cameron porn filter, for being both a naive and misguided piece of legislation, which is what some people claimed, and other people claimed it was a deliberately Orwellian attempt to control the flow of information into the UK.
[00:04:46.880 --> 00:04:47.680] Yeah.
[00:04:48.320 --> 00:04:55.520] And I criticized it at the time for being based on a rather flimsy appeal to emotion, that being what think of the children.
[00:04:55.760 --> 00:04:56.880] Yeah, yeah, absolutely.
[00:04:56.880 --> 00:05:00.400] But also on the basis that it wouldn't work anyway.
[00:05:00.400 --> 00:05:05.120] The filter as proposed would not actually function in the way that they expected.
[00:05:05.120 --> 00:05:16.480] Think of the children is also a very common argument you'll find in the more pearl-clutchy end of politics, which can be both from the left or the right, but it's very, very commonly cited in there.
[00:05:16.480 --> 00:05:29.840] And I would also argue that that's a logical fallacy as well as a thought-terminating cliché in the sense that once someone has said, but think of the children, any attempt to continue the conversation is, well, you want to hurt children then.
[00:05:29.840 --> 00:05:30.680] Yeah, yeah.
[00:05:31.000 --> 00:05:36.440] And it's also the argument from children is a key indicator of a moral panic as well.
[00:05:36.440 --> 00:05:39.800] So many moral panics are founded on, but it's hurting our children.
[00:05:40.040 --> 00:05:40.760] Yeah, exactly.
[00:05:40.760 --> 00:05:58.360] And especially when they say children, you don't think people who might be 17 falling under the age of 18 who are caught up in bans on porn and therefore the very weird situation where somebody can be legally having sex with their person they are legally allowed to have sex with.
[00:05:58.360 --> 00:06:08.760] But if they were to film themselves doing so, they'd be committing a serious sexual offense, even with everybody involved completely consensually and those kind of weird things.
[00:06:08.760 --> 00:06:17.480] I think there was even a story of somebody who got into bother for possessing child sexual abuse material because he was 16 or 17 and it was a picture of himself.
[00:06:19.400 --> 00:06:28.120] The current version of the online safety bill was initially introduced by the Boris Johnson government, although it started being drafted under Theresa May.
[00:06:28.200 --> 00:06:30.360] It was actually introduced under Boris Johnson.
[00:06:30.360 --> 00:06:32.280] So like three prime ministers ago.
[00:06:32.280 --> 00:06:35.640] I would say it's hard to know how many prime ministers that actually is at this point.
[00:06:35.640 --> 00:06:42.040] Like when you said Boris Johnson, I genuinely had to stop and think, I can't remember when that was because so much has happened since then.
[00:06:42.440 --> 00:06:45.560] It is three prime ministers ago, but it has since become law.
[00:06:45.560 --> 00:06:45.960] Yeah.
[00:06:45.960 --> 00:06:48.840] It was given royal assent, I think, October 2023.
[00:06:48.840 --> 00:06:54.120] So just in the fading days of the Rishi Sunak government, this finally came in law.
[00:06:54.120 --> 00:06:57.000] And the provisions are starting to come into effect now.
[00:06:57.000 --> 00:06:59.880] And that is a weird quirk of British law.
[00:07:00.040 --> 00:07:12.760] And maybe not just British law, but of law generally, that a government that was very clearly dying, like the Rishi Sunak government, by the late late October of 2023, it was pretty clear it was not going to win the next election.
[00:07:13.240 --> 00:07:14.600] That was pretty nailed on.
[00:07:14.600 --> 00:07:21.680] So that government was able to pass a law that it handed to its successors to say, go on, then make this work.
[00:07:22.240 --> 00:07:22.960] And then left.
[00:07:22.960 --> 00:07:26.800] And now it comes in and it's like, how dare Kia's Dharma's government do this?
[00:07:26.800 --> 00:07:29.120] Yes, and I'm seeing a lot of commentary along those lines.
[00:07:29.440 --> 00:07:31.280] Yeah, Starmer's totalitarianism.
[00:07:31.280 --> 00:07:34.080] And look, there are things that they could have done.
[00:07:34.320 --> 00:07:36.640] They absolutely could have handled this way, way better.
[00:07:36.640 --> 00:07:36.880] Yeah.
[00:07:36.880 --> 00:07:38.400] But they didn't propose this law.
[00:07:38.400 --> 00:07:39.760] They were forced to go ahead with it.
[00:07:40.880 --> 00:07:41.200] Absolutely.
[00:07:41.760 --> 00:07:42.480] You know what I mean?
[00:07:42.480 --> 00:07:43.040] Yeah.
[00:07:43.360 --> 00:07:49.680] So as of July the 25th, 2025, so this is very, very recent.
[00:07:49.680 --> 00:07:54.880] Many websites have introduced mandatory age verification for UK users.
[00:07:55.440 --> 00:07:58.240] This includes porn sites, of course, which makes sense.
[00:07:58.240 --> 00:08:00.480] It also includes many not porn sites.
[00:08:00.480 --> 00:08:01.200] Which doesn't.
[00:08:01.200 --> 00:08:06.880] Because some of those not porn sites nevertheless host some explicit content.
[00:08:06.880 --> 00:08:12.080] So Blue Sky, for example, now requires age verification for users in the UK.
[00:08:12.080 --> 00:08:13.200] So do Reddit.
[00:08:13.200 --> 00:08:14.880] So does Discord.
[00:08:14.880 --> 00:08:22.000] And also, of course, Reddit and Discord, it depends on which Reddits you're looking at and which Discord servers you're signed up to.
[00:08:22.000 --> 00:08:27.520] But if you are signed up to ones that are deemed to be adult, then you have to age verify now.
[00:08:27.520 --> 00:08:33.920] And as well as that, you also have full-on proper porn sites like Pornhub and Red Tube and X Videos and so on.
[00:08:33.920 --> 00:08:36.640] And you have to age verify to access those now.
[00:08:36.640 --> 00:08:40.640] Which reminds me, this episode of Skeptics with the K is sponsored by X Videos.
[00:08:40.640 --> 00:08:44.080] Try X Videos in case you need a wink.
[00:08:45.040 --> 00:08:45.520] So anyway.
[00:08:45.680 --> 00:08:46.320] X Videos.
[00:08:46.320 --> 00:08:47.920] I thought it was 10 videos.
[00:08:48.400 --> 00:08:49.200] Only 10 of them on the 10th.
[00:08:49.280 --> 00:08:50.160] Yeah, they've been on the website.
[00:08:50.160 --> 00:08:51.360] I assume there's only the documents.
[00:08:51.000 --> 00:08:52.080] There's just a dozen.
[00:08:52.080 --> 00:08:55.360] Yeah, you have to change them regularly when a new piece of porn comes out.
[00:08:56.000 --> 00:08:59.720] It's just the most recent 10 porn videos on the internet, is what it is.
[00:08:59.600 --> 00:09:05.320] So they change, they update every week or so when more content gets added to the internet.
[00:09:06.280 --> 00:09:08.440] Anyway, so this is now the law of the land.
[00:09:08.440 --> 00:09:12.280] But there are some significant issues with this legislation.
[00:09:12.280 --> 00:09:20.040] So in the interests of locking the stable door after the horse has bolted, I want to talk about what the issues with this are.
[00:09:20.040 --> 00:09:21.640] Because there's nothing we can do about it now.
[00:09:21.640 --> 00:09:23.080] It's already come into effect.
[00:09:23.080 --> 00:09:27.400] There is a parliamentary petition saying please repeal the Online Safety Act.
[00:09:27.640 --> 00:09:29.640] It's had over 300,000 signatures.
[00:09:29.640 --> 00:09:31.640] It's due a response and a debate.
[00:09:31.640 --> 00:09:34.040] It's not going to get anywhere because those positions never do.
[00:09:34.040 --> 00:09:36.840] I think it's been one ever since they were introduced.
[00:09:36.840 --> 00:09:43.560] And I think that was by the Gordon Brown government that introduced those petitions and one has ever actually changed the law.
[00:09:43.560 --> 00:09:44.840] Here's the thing though.
[00:09:44.840 --> 00:09:50.680] How good are the spam filters and text filters on the petition website?
[00:09:50.680 --> 00:10:09.320] Because could you sign up with names that had inserts of explicit material, therefore forcing the petition website to be the publisher of explicit material and therefore forcing the government's petition website to require age verification for accessing?
[00:10:09.640 --> 00:10:11.080] Possibly.
[00:10:11.400 --> 00:10:13.400] I think someone should try that.
[00:10:14.360 --> 00:10:21.240] So as mentioned, many platforms have introduced age verification to comply with the requirements of this bill.
[00:10:21.240 --> 00:10:29.320] So if you want to access this sort of content, probably because you fancy having a wank, you now have to demonstrate that you are over the age of 18 to do it.
[00:10:29.320 --> 00:10:31.880] And happily, teenagers never want a wank.
[00:10:32.040 --> 00:10:32.520] No, they don't.
[00:10:32.760 --> 00:10:33.720] Famously.
[00:10:33.720 --> 00:10:37.400] So we lucked out there, because otherwise that could have been really awkward.
[00:10:37.400 --> 00:10:42.760] Blue Sky, for example, have implemented this using a service called KWS, which is kids' web services.
[00:10:42.760 --> 00:10:46.800] That is a service provided by Epic Games, actually the people who made Fortnite.
[00:10:44.760 --> 00:10:50.560] So they had this service off the shelf that they were using for age-verifying players anyway.
[00:10:50.880 --> 00:10:56.800] Blue Sky has integrated that into their system, and this verifies you were an adult using one of two mechanisms.
[00:10:56.800 --> 00:11:04.240] You can either enter your credit card number or you can have an AI look at you and guess how old you are by looking at your face.
[00:11:04.880 --> 00:11:06.720] I bet that works spectacularly well.
[00:11:06.880 --> 00:11:07.360] It does.
[00:11:07.360 --> 00:11:08.880] And it's unhackable.
[00:11:09.520 --> 00:11:14.400] Both of these mechanisms have got significant problems with both false positives and false negatives.
[00:11:14.400 --> 00:11:18.080] Have you seen how good the youth of today are at makeup?
[00:11:18.080 --> 00:11:23.840] I don't consider incredible at makeup.
[00:11:23.840 --> 00:11:29.920] Like, the younger generations are properly nailed on makeup artists, essentially.
[00:11:29.920 --> 00:11:34.640] I mean, to be honest, if they're going through the effort of putting makeup on, they're doing it the hard way to get around these.
[00:11:34.720 --> 00:11:35.440] Oh, well, yes, true.
[00:11:36.400 --> 00:11:45.600] And the good thing about the credit card is there's never a point in any teenagers' history where they are anywhere near a parent's credit card for a minute's worth of time.
[00:11:45.760 --> 00:11:46.320] Never happens.
[00:11:46.480 --> 00:11:48.000] We'll come back to that.
[00:11:48.000 --> 00:11:53.200] So, there is a very well-regarded organization in the United States called NIST, which is part of the U.S.
[00:11:53.200 --> 00:11:53.600] government.
[00:11:53.600 --> 00:11:56.800] It's the National Institute of Standards and Technology.
[00:11:56.800 --> 00:12:02.720] And last year, they published their latest evaluation of age estimation software.
[00:12:02.720 --> 00:12:15.520] So, for this evaluation, they took 11 million pictures of people with known ages and ran them through six different software systems that estimated the age of the person based on the picture.
[00:12:15.840 --> 00:12:18.640] And we're going to look at three of their key metrics.
[00:12:18.640 --> 00:12:19.200] Okay.
[00:12:19.840 --> 00:12:21.880] So, the first one, the first metric is.
[00:12:22.120 --> 00:12:26.560] Is it where they're wearing a badge saying 21 today.
[00:12:26.560 --> 00:12:28.640] Yes, it's a dead giveaway.
[00:12:28.640 --> 00:12:29.040] Yeah.
[00:12:29.040 --> 00:12:31.720] The first one is mean absolute error.
[00:12:29.840 --> 00:12:36.520] So this is how far away from the person's true age is the computer's guess.
[00:12:36.840 --> 00:12:38.600] So mean absolute error.
[00:12:38.600 --> 00:12:52.600] And NIST reported that depending on several factors like the quality of the photograph and which algorithm was used, the mean absolute error ranges from 2.3 years to 5.1 years, which is quite a large error, actually.
[00:12:52.600 --> 00:13:00.520] It's maybe not so much of a problem if you're 56 and the computer incorrectly guesses that you're 51, because it doesn't make a blind bit of difference, really.
[00:13:00.520 --> 00:14:01.200] But it makes a huge difference when you're 23 and the computer thinks you look 17 and now stops you from accessing things you are legally part of your life you are legally allowed to explore yes that you'll be locked out because a computer says no and there's no way for you to particularly argue with that yeah i'm interested i don't know i assume you've not covered this because it's not relevant to the point you're making but i'm interested if humans are better at that especially if you're seeing a person in person because of other like unmeasurable cues that we just automatically pick up humans can be shit at it as well but i'd be i'd be kind of interested to know if humans are a bit better than machines the problem from a from a purely theoretical point of view that would be an interesting question from a policy point of view what will happen is the humans will ask everyone we know that because what we're talking about is challenge 25 that we found out alcohol forever i don't want to rely on whether i'm good at guessing this so i will just ask i'll just make everybody provide identification before they access alcohol.
[00:14:01.520 --> 00:14:04.320] I'm not making a point at all at all about it.
[00:14:04.640 --> 00:14:06.560] Just vaguely out of interest.
[00:14:06.560 --> 00:14:13.760] Yeah, but it feels like something that humans would be better at than machines for some intangible, unknowable, unmeasurable reason.
[00:14:13.760 --> 00:14:23.760] But the reason I'm bringing that together is I think this is downstream of the challenge 21, challenge 25 kind of mentality of like, well, you have to prove it at this point, otherwise, we're going to draw a line.
[00:14:23.760 --> 00:14:31.440] And it's the same sort of thing to me: is that now you have to prove your age for these other services, and before you know it, it's ID for everything.
[00:14:31.440 --> 00:14:33.360] I know that sounds slippery slope, but yeah.
[00:14:33.680 --> 00:14:37.280] So, yeah, it's a problem if you're 23 and the computer says you look 17.
[00:14:37.280 --> 00:14:42.320] It's arguably a bigger problem if the computer says you look 18, but you're actually 12.
[00:14:42.560 --> 00:14:42.880] Yes.
[00:14:42.880 --> 00:14:43.840] Yes, yeah, yeah, yeah.
[00:14:43.840 --> 00:14:49.360] Because the stated goal here is to protect children from accessing content which the government says is harmful.
[00:14:49.360 --> 00:14:58.160] And that fails spectacularly when your software lets through 12 and 13 year olds because the face estimation software has got error bars of five and a bit years.
[00:14:58.160 --> 00:15:14.800] I wonder whether those error bars are weighted in one direction, though, whether it's more likely to underestimate your age rather than overestimate it for very significant, like 12, you're unlikely to mistake a 12-year-old for an 18-year-old because of the physiological changes that happen very, very quickly between the ages of 11 and 15 or whatever.
[00:15:14.800 --> 00:15:17.440] Possibly, we've got other metrics that we can look at for that.
[00:15:17.440 --> 00:15:18.000] Okay.
[00:15:18.000 --> 00:15:28.800] So, NIST were, of course, mindful that the true purpose of face age estimation software is to facilitate access to adult services, whether that's porn or buying alcohol or whatever.
[00:15:29.120 --> 00:15:34.320] So, the second metric that they looked at was the binary false negative rate.
[00:15:34.320 --> 00:15:38.640] So, that is to say, if you ask the software, is this person of legal age?
[00:15:38.640 --> 00:15:42.400] How often does the software say no, even when you are?
[00:15:42.800 --> 00:15:43.520] Right?
[00:15:43.840 --> 00:15:50.160] And what they found in that test, was errors in the range between 4% and 30%.
[00:15:50.480 --> 00:16:05.080] So, that is to say, for people who were adults, between 4 and 30% of the time, depending on the algorithm used, the quality of the picture, that sort of stuff, between 4 and 30% of the time, the software would say, no, sorry, you're too young, and would deny access to that.
[00:16:05.880 --> 00:16:08.680] That's nearly a third of times on the outside edge.
[00:16:08.680 --> 00:16:14.840] Did they break any of that down by, for example, protected characteristics status?
[00:16:14.840 --> 00:16:20.120] Yes, well, we'll come to the demographic confounders in a moment as well.
[00:16:20.120 --> 00:16:27.800] So the flip side of that, again, arguably more concerning given the goals of this legislation, is the binary false positive rate.
[00:16:28.200 --> 00:16:29.880] So that's the third metric.
[00:16:29.880 --> 00:16:37.640] NIST took images of children aged 13 to 17 and again asked the software, is this person of age?
[00:16:37.640 --> 00:16:42.520] And in some cases, the error rate was up to 45% in these cases.
[00:16:42.520 --> 00:16:49.160] So in 45% of cases, again, depending on the quality of the picture, which algorithm was being used, that sort of thing.
[00:16:49.160 --> 00:16:58.040] But in up to 45% of the cases, it said, yes, that person is an adult when they were aged between 13 and 17 years old.
[00:16:59.400 --> 00:17:09.720] So in this case, in the case before, where for the false negative rate, in the event of an error, an adult is denied access to services that they should be allowed to access.
[00:17:10.040 --> 00:17:17.400] In the case of the false positive error, a child is granted access to pornography, which based on this face age estimation software.
[00:17:17.400 --> 00:17:19.000] So that's not brilliant, right?
[00:17:19.640 --> 00:17:38.120] So from both ends of this argument, whether your position is this is censorship, this is terrible, adults should be able to access whatever they want, and now 30% of adults can't, or whether your position is think of the children, they need to be protected from the heinous things in this world, but 45% of children are granted access to this anyway.
[00:17:38.600 --> 00:17:42.040] Either way round, you look at this, the system is not sufficient.
[00:17:42.680 --> 00:17:45.360] And nobody should be happy with these metrics.
[00:17:44.520 --> 00:17:49.120] These problems are then further exacerbated by demographic biases.
[00:17:49.200 --> 00:18:06.480] So, for white faces compared to Asian and African faces, NIST estimates that the error rates for Asian and African faces are one to two orders of magnitude higher, with some software estimating the face of Africans with an error of over 40 years.
[00:18:06.480 --> 00:18:08.480] Now, that's on the young side.
[00:18:08.480 --> 00:18:12.080] It's older Africans being estimated as much younger than they are.
[00:18:12.080 --> 00:18:19.200] But nevertheless, this age-guessing software is dreadful at guessing the age of non-white people.
[00:18:19.200 --> 00:18:20.000] Yeah.
[00:18:20.320 --> 00:18:25.360] In terms of other protected characteristics, they don't look at that in the paper that I looked at.
[00:18:25.360 --> 00:18:29.840] But I can imagine certain disabilities being impacted significantly.
[00:18:30.320 --> 00:18:30.880] Yeah.
[00:18:31.840 --> 00:18:38.640] The fallback mechanism that's used by many of these services is the old favorite of the Porny website, which is the credit card check.
[00:18:38.640 --> 00:18:47.440] So, if you fail the face age or you just don't fancy the face age, the software can say, all right, you look young to me, so try this instead.
[00:18:47.760 --> 00:18:55.440] And children cannot, they can legally have a debit card in the UK, but you cannot get a credit card unless you are aged 18.
[00:18:55.680 --> 00:18:57.440] Oh, it has to be a credit card, not a debit card.
[00:18:57.440 --> 00:18:58.240] It has to be a credit card.
[00:18:58.480 --> 00:18:58.960] Oh, my God.
[00:18:59.200 --> 00:19:07.360] That's a major issue because just having a credit card is a risk factor for getting into credit card debt.
[00:19:07.360 --> 00:19:08.080] Yes.
[00:19:08.400 --> 00:19:13.760] And so, some services like Blue Sky have implemented a credit card check as a fallback mechanism.
[00:19:13.760 --> 00:19:19.280] So, they don't charge your card, or if they do, they'll charge a nominal amount, like 5p.
[00:19:19.600 --> 00:19:24.800] But all they're doing is checking: is this card valid and is it a credit, not a debit card?
[00:19:24.800 --> 00:19:28.560] Because it has to be a credit card because credit cards are only available to over 18s.
[00:19:28.640 --> 00:19:30.600] Debit card, you get one when you're fucking 12, right?
[00:19:29.920 --> 00:19:36.840] But they're, of course, not matching it to your name and shouldn't be matching it to your name because that would be majorly problematic.
[00:19:37.160 --> 00:19:42.200] But also, that means, as you say, you can go into your mom's wallet and grab a credit card.
[00:19:42.200 --> 00:19:46.520] I mean, and obviously, yes, you have to give someone credit card details for that.
[00:19:46.520 --> 00:19:49.080] And okay, I really, really trust this system they put in.
[00:19:49.080 --> 00:19:51.160] I really trust whoever they've hired to do this.
[00:19:51.160 --> 00:19:53.160] And the good thing is, these things never get hacked.
[00:19:53.320 --> 00:19:54.280] Never ever get hacked.
[00:19:54.280 --> 00:19:56.280] No, so there's a few problems with that.
[00:19:56.280 --> 00:20:01.640] We've touched on a couple of them, but many adults, especially those on low incomes, do not have credit cards.
[00:20:01.640 --> 00:20:02.360] Yep.
[00:20:02.360 --> 00:20:07.960] So, according to the website Merchant Savvy, in 2024, 35% of adults do not have a credit card.
[00:20:07.960 --> 00:20:09.640] Yeah, yeah, neither of my parents had credit cards.
[00:20:09.640 --> 00:20:13.320] I grew up, we never had a credit card in the house the first time with a credit card.
[00:20:13.320 --> 00:20:15.400] I got it maybe eight years ago.
[00:20:15.400 --> 00:20:21.880] Yeah, that goes up to 71% of adults aged between 18 and 24 who do not have a credit card.
[00:20:21.880 --> 00:20:25.720] And these are the adults who are the ones most likely to fail the face age check.
[00:20:25.800 --> 00:20:27.800] Yeah, just like because they're closer to the calf.
[00:20:27.800 --> 00:20:28.040] Yeah.
[00:20:28.040 --> 00:20:28.600] Yeah.
[00:20:28.600 --> 00:20:31.960] 71% of adults age 18 to 24 do not have a credit card.
[00:20:31.960 --> 00:20:37.560] So even this fallback mechanism is locking adults out of legal content.
[00:20:37.560 --> 00:20:41.240] Albeit adult content, but content that they are legally able to access.
[00:20:41.240 --> 00:20:45.160] Or forcing them to get a credit card, forcing them to go through a credit card check.
[00:20:45.160 --> 00:20:49.640] Possibly damaging their credit rating, possibly encouraging them to get into debt and so on.
[00:20:49.640 --> 00:20:56.200] It is literally saying if you want to access OnlyFans, you first need to have a credit card that you put into the website, OnlyFans.
[00:20:56.920 --> 00:20:58.280] Now don't get into bother.
[00:20:58.280 --> 00:21:07.560] So assuming an adult population of 55 million people, if 30% of them fail the phase age check, that's 16.5 million people who fall back to the credit card check.
[00:21:07.560 --> 00:21:16.560] 35% of those don't have credit cards, that's 5.5 million adults who could be denied lawful access to content in the name of protecting children.
[00:21:14.760 --> 00:21:18.400] Obviously, that's back of the envelope numbers.
[00:21:18.480 --> 00:21:24.160] It's not going to be as straightforward as that, but that's you know, that's roughly where we're going to be.
[00:21:24.800 --> 00:21:27.520] So that's the false negative rate for credit cards.
[00:21:27.520 --> 00:21:29.040] What about the false positive rate?
[00:21:29.040 --> 00:21:35.360] How often do credit do credit card checks effectively prevent children from accessing adult content?
[00:21:36.000 --> 00:21:44.640] The Online Safety Act explicitly says a credit card is an acceptable tool for the purposes of age verification.
[00:21:44.640 --> 00:21:56.240] But it is not difficult to imagine, as we've already said, a 14, 15, 16-year-old wanting access to some assault service and asking their mum and dad to lend them a card or just taking the card without permission.
[00:21:56.240 --> 00:21:59.920] And Tesco's won't let you do it for buying alcohol.
[00:21:59.920 --> 00:22:15.040] I've been into Tesco before or any other supermarket before where I've been age verified for buying alcohol and not had my ID with me because I didn't think I needed an ID at 35 and gone, well, look, I've got a credit card, like at least that proves them over 18, right?
[00:22:15.040 --> 00:22:16.480] And they've gone, no, we won't take that.
[00:22:16.640 --> 00:22:16.960] Yeah.
[00:22:16.960 --> 00:22:21.840] Well, the Online Safety Act says it is explicitly acceptable as a form of ID.
[00:22:21.840 --> 00:22:37.200] I have not been able to find specific statistics about the misuse of credit cards by minors in the UK, but the Citizens Advice website says that it is, quote, quite common for a family member to use your card without permission.
[00:22:37.840 --> 00:22:46.080] But even leaving aside fraudulent use, the use of Discord, for example, is extremely widespread amongst teenagers.
[00:22:46.080 --> 00:22:50.080] It's a very, very widely used platform for friends to communicate with each other.
[00:22:50.080 --> 00:22:54.080] And now, Discord is requiring age verification to access adult content.
[00:22:54.080 --> 00:23:09.320] However, mum and dad might not realize that that's what the verification is for and may allow their child to use the card to verify with Discord, not really appreciating that this isn't a simple verify my account, it's an age check to access adult content.
[00:23:09.320 --> 00:23:15.240] Or might know that's what it's for and just trust the kids to make an educational choice for themselves and pop it in.
[00:23:15.240 --> 00:23:22.200] I'm sure plenty of parents will be like, well, I mean, how many parents let the kids play age in appropriate according to the scoring system?
[00:23:22.200 --> 00:23:22.760] Video games.
[00:23:23.000 --> 00:23:26.920] Video games or watch fucking Terminator when they're eight or whatever.
[00:23:27.240 --> 00:23:28.040] Yeah, yeah.
[00:23:28.040 --> 00:23:35.160] Another way to circumvent these checks, which is proving to be very common, is to use a video game with what's called photo mode enabled.
[00:23:35.160 --> 00:23:44.760] So when you're doing the face age estimation checks, typically they will ask you to perform certain actions so the software can tell you're not just holding a photo up of your granddad or something like that.
[00:23:44.760 --> 00:23:48.120] So it might say open your mouth, turn your head left, things like that.
[00:23:48.120 --> 00:23:49.880] You'll have done this when you sign it with Monzo.
[00:23:50.200 --> 00:23:53.160] They've got very similar read a sentence out, yeah.
[00:23:53.160 --> 00:23:53.800] Yeah.
[00:23:54.120 --> 00:24:02.760] But these are also actions that you can get in-game characters to perform in photo mode in games like Death Stranding.
[00:24:03.080 --> 00:24:06.120] So you can make the game character pose in a certain way.
[00:24:06.120 --> 00:24:09.960] You point your camera at the screen and it's a photo reel game character.
[00:24:09.960 --> 00:24:21.720] And when the verification tool says, right now open your mouth, you make it open its mouth and then Shalomy Gallami Zoop, there it is, you're approved without pinching anybody's credit card and while you're still 13 years old.
[00:24:22.040 --> 00:24:36.840] So there is comprehensive empirical research that shows that age verification technologies like this fail to meaningfully protect children while also creating substantial privacy and accessibility risks to adult consumers of adult content.
[00:24:36.840 --> 00:24:48.640] Because many people, of course, even if they have a credit card, even if they do have a government ID, even if they would pass the face age checks, who wants their photograph, passport, and credit card associated with an online porn site?
[00:24:48.640 --> 00:24:49.440] Yeah.
[00:24:49.760 --> 00:24:52.800] And it's not even just porn, it's what's deemed adult content.
[00:24:52.800 --> 00:25:00.160] And that category can contain information about gender diversity, information about sexual sexuality diversity.
[00:25:00.160 --> 00:25:08.800] And who wants to give their face and credit card number to the website they visit to figure out if they're gay or not, to explore whether they're gay or not?
[00:25:09.120 --> 00:25:15.760] Rightly or wrongly, we've got a culture which is still very judgmental about porn and the use of pornography as an aid to sex or masturbation.
[00:25:15.760 --> 00:25:23.120] And that social shame will be significant enough a pressure for some people that this effectively amounts to government censorship.
[00:25:23.120 --> 00:25:23.680] Yes.
[00:25:23.680 --> 00:25:26.960] Because there is enough pressure from just from that social stigma.
[00:25:26.960 --> 00:25:39.360] I might add at this point, although this is not the roads that I'm going to go down, that might very well be the point because this sort of age verification legislation is being pushed and lobbied for very heavily by evangelical Christian countries.
[00:25:39.360 --> 00:25:40.080] Yes, it is, yeah.
[00:25:40.080 --> 00:25:50.160] Both here and in the US, with the express intention of using that social stigma to stop even adults from accessing pornography, but dressing it up as child protection.
[00:25:50.480 --> 00:26:01.120] So moving on from the efficacy of verification tools, how effective are the age gate blocks themselves at preventing access to adult content?
[00:26:01.440 --> 00:26:04.400] So there are a couple of components to this.
[00:26:04.720 --> 00:26:07.920] For one particular adult website, it's not one that I've mentioned.
[00:26:07.920 --> 00:26:10.560] I'm not going to mention it, but it is a proper porn site.
[00:26:10.560 --> 00:26:15.680] It's not like Reddit or something where just it's an ordinary site, but some people post porn on it.
[00:26:15.680 --> 00:26:24.720] Upon visiting this site, I was greeted with a big verify your age overlay, and I opened DevTools on Chrome and deleted it and then watched the porn.
[00:26:25.680 --> 00:26:27.360] Absolutely fine, no problem at all.
[00:26:27.360 --> 00:26:27.840] Brilliant.
[00:26:27.800 --> 00:26:28.040] Yep.
[00:26:28.240 --> 00:26:36.120] I was also able to do the same thing from my phone with a feature that Apple have called hide distracting items, which is meant to be used for removing ads.
[00:26:36.200 --> 00:26:39.720] So you can click on the ad and say hide this, and it just deletes it.
[00:26:39.720 --> 00:26:44.520] Also removes the age verification pop-up and just you can watch the porn that's there anyway.
[00:26:44.520 --> 00:26:49.320] Yeah, but the hide distracting items also completely remove what you were looking for in the porn video.
[00:26:49.320 --> 00:26:52.200] So no, this would be way too distracting for you.
[00:26:52.520 --> 00:26:57.880] Now, this isn't something that works on every single adult website, but it works on at least some.
[00:26:57.880 --> 00:27:05.400] It would also be trivial enough to code up a browser extension for Chrome that could detect and remove these age verification overlays.
[00:27:05.400 --> 00:27:18.360] I imagine many children would know that friend who knows how DevTools works or can get them a kind of inline script that's called a bookmarklet where it becomes a bookmark on your button and you just click that and it would disappear.
[00:27:18.360 --> 00:27:24.360] When I was a teenager, my dad put a block on MSN Messenger.
[00:27:24.360 --> 00:27:25.880] Remember when I was a message?
[00:27:26.040 --> 00:27:28.280] So you could only use it for an hour at a time.
[00:27:28.600 --> 00:27:32.440] But I was in a long-distance relationship with Warren at that time, so I wanted to speak to him with mum.
[00:27:32.680 --> 00:27:33.960] I knew all the routes to get around it.
[00:27:33.960 --> 00:27:39.640] Like you just, I was on MSN Messenger for one hour and then I logged into like various different routes that you could use as well.
[00:27:39.640 --> 00:27:40.840] Just use it online.
[00:27:41.320 --> 00:27:42.520] It didn't do anything.
[00:27:42.520 --> 00:27:45.400] It just took me a little bit longer to get access to it.
[00:27:45.400 --> 00:27:47.560] Because teenagers are the best at figuring that stuff out.
[00:27:47.880 --> 00:27:48.920] Digital natives, yeah.
[00:27:48.920 --> 00:27:51.240] And they don't even necessarily need to understand it.
[00:27:51.240 --> 00:27:55.160] They just need to know someone who knows someone who knows someone who understands it, right?
[00:27:55.160 --> 00:27:57.000] Or be capable of Googling.
[00:27:57.000 --> 00:27:59.800] Or be capable of asking AI to tell them.
[00:28:00.040 --> 00:28:02.280] Chat GPT, how do I circumvent this?
[00:28:02.440 --> 00:28:02.840] Oh, cool.
[00:28:02.840 --> 00:28:03.560] Thanks.
[00:28:03.560 --> 00:28:09.000] Those same age verification overlays are also only shown to UK users.
[00:28:09.000 --> 00:28:18.400] So if you fire up your copy of ZoomyZoom VPN, you can access exactly the same content without the age verification checks by just setting your country to Sweden or whatever.
[00:28:18.720 --> 00:28:21.920] Is this story also sponsored by Zoomizoom VPN?
[00:28:21.920 --> 00:28:27.760] It is sponsored by Zoomizum VPN in case you want to wank.
[00:28:28.320 --> 00:28:37.840] But we know from experience that every fucking podcast in the world is currently ramming NordVPN and Express VPN and Proton VPN and all that kind of stuff down your throat.
[00:28:37.840 --> 00:28:43.200] And explicitly advertising it is a way to bypass the geographic blocking of content.
[00:28:43.200 --> 00:28:50.960] Now they're saying it's to get around Netflix geo blocks and things like that, but it works exactly the same way for evading age checks on porn.
[00:28:50.960 --> 00:28:52.960] Are you allowed to make that point?
[00:28:52.960 --> 00:28:54.960] Because I haven't fully read the legislation.
[00:28:54.960 --> 00:28:56.800] I read some of the analysis of it here and there.
[00:28:56.800 --> 00:29:07.200] And part of the thing it was suggesting was it is no longer legal for websites that would be subject to the age verification to list information about how to use a VPN.
[00:29:07.520 --> 00:29:08.480] I don't know.
[00:29:08.480 --> 00:29:09.600] I've got no idea.
[00:29:09.600 --> 00:29:10.080] Okay.
[00:29:10.080 --> 00:29:22.240] But what I did do was I checked the Google Trends data for the UK, and there has been a four-fold increase in searches for the phrase VPN since these provisions came into force.
[00:29:22.480 --> 00:29:27.840] So just in the last week, there is four times as many searches for VPN as there were before that.
[00:29:28.000 --> 00:29:41.200] And that's not just people looking to access things that they shouldn't quote-unquote be accessing, but also people who just don't want to give the credit card details to companies or have their face scanned for companies.
[00:29:41.200 --> 00:29:45.200] So circumvention of these checks is trivial in many cases.
[00:29:45.200 --> 00:29:56.320] Research from Stanford University, which was published in pre-print in March, found that the introduction of age verification laws in some US states resulted in a 24% increase in searches for VPNs.
[00:29:56.320 --> 00:30:06.440] But more worryingly, they also observed a 48% increase in searches for porn sites that were known to not comply with age verification laws.
[00:30:07.400 --> 00:30:17.720] And if they're not complying with some laws, then what else are they not complying with in an industry that is rampant with problems and workers who aren't being adequately protected?
[00:30:17.720 --> 00:30:18.280] Absolutely.
[00:30:18.280 --> 00:30:25.560] So while sites like Pornhub and RedTube and whatever have got these age verifications in place, not all porn sites have.
[00:30:25.560 --> 00:30:28.120] Some of them are just ignoring the law.
[00:30:28.120 --> 00:30:38.360] So while the reputable porn sites, quote-unquote reputable porn sites, like Pornhub might be cooperating with the government on this one, the less reputable websites are not.
[00:30:38.360 --> 00:30:49.480] And the less reputable websites who are less concerned with following the law are where you're going to encounter exactly the kind of harmful and even illegal content that this law was supposed to prevent access to.
[00:30:49.480 --> 00:30:58.760] So the effect of these age verification laws can actually drive consumers, including children, toward harmful images and videos.
[00:31:00.040 --> 00:31:12.280] So we have then good empirical evidence which reveals a significant disconnect between the policy objectives of age verification and the practical outcomes of age verification.
[00:31:12.280 --> 00:31:15.000] And how this plays out within the UK remains to be seen.
[00:31:15.000 --> 00:31:29.880] We don't know how people are going to respond to this yet, but I'm skeptical that this will have the intended effect and may in fact have the opposite effect of driving curious children towards less regulated platforms that have fewer safeguards in place.
[00:31:30.840 --> 00:31:45.680] There's also significant demographic biases in the age verification of non-white faces, or for people, particularly young adults, who are more likely to fail face age verification, have no credit card, and have no government issued ID, and they just can't access this stuff now.
[00:31:46.080 --> 00:32:00.880] It's a major issue for people from all sorts of different backgrounds who can't get access to credit, you know, home on how to people, as you say, people on low income who might not be able to get people who've been in debt before who aren't allowed to get a credit card.
[00:32:01.280 --> 00:32:03.440] How many people does that penalize?
[00:32:03.440 --> 00:32:10.160] Yeah, literally right now, neither my parents possess photo ID, neither of them possess credit cards.
[00:32:10.400 --> 00:32:14.000] And even my mum's boyfriend doesn't have either of those things, I don't think.
[00:32:14.000 --> 00:32:16.240] Maybe he's got a photo driving license, maybe.
[00:32:16.240 --> 00:32:17.760] But certainly my mum wouldn't be able to do it.
[00:32:17.760 --> 00:32:18.640] My dad wouldn't be able to do it.
[00:32:18.720 --> 00:32:21.520] So, like, and they're in their 60s at this point.
[00:32:21.840 --> 00:32:24.160] And we haven't touched on this aspect yet.
[00:32:24.160 --> 00:32:27.680] When you prove your age, you need to do that with some sort of login.
[00:32:29.200 --> 00:32:30.640] You don't do it every time, right?
[00:32:30.640 --> 00:32:33.120] You do it once, and then it's saved against your account.
[00:32:33.120 --> 00:32:40.800] So now you need to sign up for Pornhub or Red Tube or whatever and actually log in to be able to access that content.
[00:32:40.800 --> 00:32:48.320] And while we have talked about the social stigma of having your name associated with porn, there is also the further risk of a data breach on these services.
[00:32:48.320 --> 00:32:54.880] And now your verified personal details and a list of videos you've been watching can be used to blackmail or extort you.
[00:32:54.880 --> 00:32:55.520] Yeah.
[00:32:56.480 --> 00:32:59.360] But we're still not done yet.
[00:32:59.360 --> 00:33:18.560] Because while most of this conversation is around the Online Safety Act is being directed around pornography, the Online Safety Act is not just about porn because there are other kinds of quote-unquote harmful content, which there are legitimate concerns around, like self-harm or pro-anna content or other eating disorder content and that sort of stuff.
[00:33:18.560 --> 00:33:25.520] But the way the law is phrased places a significant regulatory burden on web platforms.
[00:33:25.520 --> 00:33:31.160] So specifically, these provisions apply to what they refer to as user-to-user services.
[00:33:29.840 --> 00:33:36.040] So that's websites where users publish content and other users consume it.
[00:33:36.600 --> 00:33:41.720] So Twitter, Facebook, Discord, Reddit, forums, chat rooms, that sort of thing.
[00:33:42.360 --> 00:33:58.280] And the law requires that operators of those sites have moderation procedures in place to monitor and remove illegal content and also to prevent children from accessing harmful content, even if that content is legal for adults.
[00:33:58.600 --> 00:34:04.680] And that's probably fine when you're Reddit or Discord because you have moderators in place already.
[00:34:04.680 --> 00:34:10.680] But there are also smaller sites operated by individuals which fall into these categories.
[00:34:11.000 --> 00:34:18.280] So furry.energy was a small Mastodon server targeted at furries, particularly within the LGBTQ community.
[00:34:18.280 --> 00:34:29.720] The owners of that website shut it down in March because the volunteer owner was not confident that they had the resources to police the content to the extent required by law and couldn't afford the fines if they made a mistake.
[00:34:29.720 --> 00:34:35.960] The fines for this are up to 18 million pounds or 10% of your worldwide gross revenue.
[00:34:35.960 --> 00:34:37.080] Yeah, yeah, yeah.
[00:34:37.080 --> 00:34:40.120] So the fines are absolutely enormous.
[00:34:40.120 --> 00:34:44.840] Dads with Kids, which is a forum for single fathers, closed for the same reason.
[00:34:44.840 --> 00:34:47.880] Moderators say we can't afford the fines if we get this wrong.
[00:34:47.880 --> 00:34:52.440] Green Living, a forum about sustainable living, again, closed for the same reason.
[00:34:52.440 --> 00:34:54.920] We can't afford the fines if we get this wrong.
[00:34:54.920 --> 00:35:00.840] Genuinely, the question would be: there is a Discord associated with skeptics in the pub online.
[00:35:00.840 --> 00:35:01.240] Yeah.
[00:35:01.240 --> 00:35:02.120] Is that under risk?
[00:35:02.120 --> 00:35:02.840] Is that under threat?
[00:35:02.840 --> 00:35:04.920] Genuinely, well, we paused this short to it.
[00:35:04.920 --> 00:35:06.280] I don't know whether it would be.
[00:35:06.280 --> 00:35:11.240] It shouldn't be because Discord have moderators in place for that sort of stuff.
[00:35:11.240 --> 00:35:13.320] So you have to go to the big platforms now.
[00:35:13.320 --> 00:35:14.960] You can't just have old forums.
[00:35:14.960 --> 00:35:19.360] So there was a skeptics forum, not like UK Skeptics Forum for a while.
[00:35:14.760 --> 00:35:19.520] Yeah.
[00:35:19.760 --> 00:35:24.480] That would be sensible to close down because it will be massively under this legislation.
[00:35:24.720 --> 00:35:25.360] Yeah.
[00:35:25.680 --> 00:35:33.600] And these kind of forums and chat rooms are unlikely to have been hosting the kind of harmful and illegal content that this legislation was meant to address.
[00:35:33.600 --> 00:35:40.960] But the required risk assessments and the harsh penalties were enough to chill those users into closing down their websites.
[00:35:40.960 --> 00:35:46.160] And this disproportionately impacts niche groups like lesbian, gay, and trans groups.
[00:35:46.160 --> 00:35:58.400] And has the knock-on effect of further centralizing power with large online platforms who can afford moderators, but often can afford moderators because they're harvesting customers' personal information to sell them ads.
[00:35:58.400 --> 00:35:59.200] Yes, yeah, yeah.
[00:35:59.200 --> 00:36:08.480] And the reason it particularly harms niche groups is because just being gay is considered, could be easily be pulled into the consideration of that's mature content.
[00:36:08.480 --> 00:36:08.800] Absolutely.
[00:36:08.880 --> 00:36:13.280] The discussion of being gay counts in that case as something we need to protect children from.
[00:36:13.280 --> 00:36:14.320] Yes, it does.
[00:36:14.640 --> 00:36:22.160] So what we're doing here is further exacerbating one problem in the name of solving another problem and solving it ineffectually.
[00:36:22.160 --> 00:36:36.800] So while these systems may satisfy the political demands to be seen to be doing something about online child safety, the research actually consistently demonstrates that these mechanisms fail to meaningfully protect children.
[00:36:37.120 --> 00:36:46.240] Not only that, but we're actually seeing even amongst the large providers the age gating of all content that they deem mature, even if it is not harmful.
[00:36:46.240 --> 00:36:47.680] Speaking to your point, Marsh.
[00:36:47.680 --> 00:36:48.160] Yeah.
[00:36:48.160 --> 00:36:57.840] Reddit, for example, has put age verification demands on support forums for sexual assault survivors, for queer and trans discussion forums, for eating disorder support forums.
[00:36:57.840 --> 00:37:11.080] And so now teenagers are being excluded from those spaces by these age gates, despite teenagers being at greatest risk, for example, of developing an eating disorder and most in need of that kind of support.
[00:37:12.040 --> 00:37:19.640] Also caught in the middle of this is Wikipedia, because Wikipedia lets some users write content and other users read content.
[00:37:19.640 --> 00:37:40.680] So it technically qualifies as a user-to-user service under the terms of the Online Safety Act, meaning that Wikipedia may be required to prevent children from seeing quote-unquote harmful content, which could mean placing age verification walls on certain articles, requiring users to log in to be able to read Wikipedia, that sort of thing.
[00:37:41.000 --> 00:37:44.200] Wikipedia have proactively taken action to prevent this.
[00:37:44.200 --> 00:37:54.680] They've said we are not going to do any of these things, and they've brought this to judicial review to ensure that they are not classified in such a way that means they will be expected to put these moderation procedures in place.
[00:37:54.680 --> 00:38:01.720] Because when people turn to ChatGPT to answer search queries anyway, that kills Wikipedia in the UK.
[00:38:01.720 --> 00:38:05.240] People will just stop using it if you have to log in to find stuff on Wikipedia.
[00:38:05.240 --> 00:38:08.360] You'll just go to ChatGPT and ask a question instead.
[00:38:08.360 --> 00:38:09.800] And it's not just Wikipedia.
[00:38:09.800 --> 00:38:15.320] WhatsApp, Signal, iMessage, Telegram, Session, all these encrypted messaging services.
[00:38:15.320 --> 00:38:25.960] The Online Safety Act requires that providers of end-to-end encrypted communication provide the British government with a back door to access the encrypted data, just in case it's something harmful.
[00:38:25.960 --> 00:38:35.480] Just in case it's terrorism or child sex abuse or whatever, they have to have the means to access everyone's encrypted private communications just in case.
[00:38:35.480 --> 00:38:43.800] Yeah, it's essentially the digital version of the police are allowed to strip you naked on the street just in case you're concealing something that they need to be worried about.
[00:38:43.800 --> 00:38:47.360] When Theresa May's government tried to introduce this, it was referred to as the Snoopers Charter.
[00:38:47.360 --> 00:38:47.760] Yeah.
[00:38:47.760 --> 00:38:49.360] And that's very much how it's being used.
[00:38:49.360 --> 00:38:57.440] It's what's surprising to me is it's being used this way by a Labour government when the Tories actually didn't do anything with this, even after it had passed.
[00:38:57.440 --> 00:38:57.760] Yeah.
[00:38:57.760 --> 00:39:01.040] I wish it was surprising to me currently.
[00:39:01.040 --> 00:39:12.720] But this has caused a terrific amount of disquiet in the software community because those end-to-end encrypted systems are designed in such a way that even the operator of the system cannot access the content.
[00:39:12.720 --> 00:39:18.640] Apple have already withdrawn some of their newer privacy products from the UK because of this legislation.
[00:39:18.640 --> 00:39:22.320] Not specifically the Online Safety Act, but it's related to it.
[00:39:22.320 --> 00:39:32.880] So their advanced data protection service was withdrawn from the UK earlier this year after the UK government demanded that they modify it to allow law enforcement to access encrypted data.
[00:39:32.880 --> 00:39:39.840] Apple's argument, Apple's response to that, which is a valid argument, is there is no such thing as a backdoor for the good guys only.
[00:39:39.840 --> 00:39:40.480] Yes, yeah.
[00:39:40.480 --> 00:39:42.480] Once it exists, it can be exploited.
[00:39:42.480 --> 00:39:50.640] And how would the British government feel about the government of a hostile foreign power demanding access to the encrypted communications of Keir Starmer?
[00:39:50.640 --> 00:39:51.040] Yeah.
[00:39:51.040 --> 00:39:56.080] You know, because it would be exactly the same mechanisms that would be used for that.
[00:39:56.960 --> 00:40:08.800] Now, the government has promised that it will not try to enforce the provisions of the Online Safety Act that require companies to break encryption until it is technically feasible for them to do so.
[00:40:09.120 --> 00:40:11.280] But that is also a promise.
[00:40:11.280 --> 00:40:12.320] Yes, bullshit.
[00:40:12.320 --> 00:40:16.720] There's no safety mechanisms in the legislation that will hold the government to that.
[00:40:16.720 --> 00:40:20.080] They're just saying, we swear we won't do anything about this.
[00:40:20.080 --> 00:40:25.520] And that's just a policy change away from them starting to sue people for using encrypted chats.
[00:40:25.520 --> 00:40:29.600] Yeah, especially when what they're asking for is almost definitionally not possible.
[00:40:29.600 --> 00:40:29.960] Yes.
[00:40:29.960 --> 00:40:30.520] So yeah, we will.
[00:40:30.600 --> 00:40:33.320] Because it's not end-to-end encrypted anymore if there's a backdoor into it.
[00:40:29.680 --> 00:40:36.760] Yeah, we will not enforce this policy until unicorns are real.
[00:40:37.000 --> 00:40:37.480] Yes.
[00:40:37.480 --> 00:40:39.320] You're not going to stick to that.
[00:40:39.320 --> 00:40:53.160] Various online providers have said that they would sooner withdraw from the UK market rather than compromise their own products, which means that these requirements, if enforced, would effectively outlaw encrypted communication in the UK.
[00:40:53.480 --> 00:40:54.760] Which is a lot, right?
[00:40:54.760 --> 00:40:55.320] Yeah.
[00:40:55.640 --> 00:40:59.720] So the last thing I want to talk about is the problem itself.
[00:41:00.040 --> 00:41:07.800] It is true that we are facing a generation of children who have instant access to the most extreme forms of pornography.
[00:41:07.800 --> 00:41:11.160] When I was a kid, porn was something you found in a bush in the park.
[00:41:11.160 --> 00:41:11.480] Yes.
[00:41:11.480 --> 00:41:11.960] Right?
[00:41:11.960 --> 00:41:16.200] And it amounted to a lady with big boobs smiling while looking dead behind the eyes.
[00:41:16.600 --> 00:41:18.840] That was what porn in the 90s was.
[00:41:18.840 --> 00:41:21.000] And it's the only thing you can get off to now.
[00:41:22.200 --> 00:41:24.920] It just had a deep effect on the only thing that works.
[00:41:25.240 --> 00:41:34.360] But even when I was in my late teens and able to access internet pornography for the first time, internet pornography was a very different beast in late 90s, early 2000s.
[00:41:34.360 --> 00:41:39.480] Content featuring anal sex was like the hardcore stuff, was the scary stuff.
[00:41:39.480 --> 00:41:42.920] And now anal is treated as like a given as table stakes.
[00:41:42.920 --> 00:41:47.000] Well, obviously you do anal, but what do you do for the really hard stuff?
[00:41:47.640 --> 00:41:59.000] There has also been what you might refer to as the BDSMification of porn, where porn that would have been considered niche BDSM content is now considered mainstream content.
[00:41:59.000 --> 00:42:19.440] So having faces slapped and hair-pulled, and degradation, and name-calling, which was once the preserve of the FET community and enacted within a framework of negotiation and understanding and consent and safeguards and give and take, is now presented without any of that framework, at least in how it's depicted on screen, as if this is what sex is normally like.
[00:42:20.640 --> 00:42:30.000] And we are seeing this translating to how teenagers, especially young women, are being treated by their partners during their first tentative sexual experiences.
[00:42:30.000 --> 00:42:32.400] This is not an imaginary problem.
[00:42:32.400 --> 00:42:40.240] The immediate access to porn seems to be the proximate cause of this change in teenagers' shift in attitudes.
[00:42:40.640 --> 00:42:42.320] Certainly one of the proximate causes.
[00:42:42.320 --> 00:42:43.040] Yes.
[00:42:43.360 --> 00:42:50.720] But I don't think that gating adult content in the way that the Online Safety Act tries to is going to be an effective counter to this.
[00:42:51.040 --> 00:42:55.360] What I think would be an effective counter to this is improved sex education.
[00:42:56.320 --> 00:43:00.800] Kids aren't watching RoboCop and thinking that's how policing works, right?
[00:43:00.800 --> 00:43:03.920] They understand the difference between fantasy and reality.
[00:43:03.920 --> 00:43:13.360] And what they need to understand is that porn represents a fantasy world that is not reflective of real sex or typical sexual relationships.
[00:43:13.680 --> 00:43:25.840] But ironically, the pearl clutchers who have pushed for this kind of legislation over the last decade or so are also the same ones who argue against comprehensive sex education about sex and sexuality in schools.
[00:43:25.840 --> 00:43:26.240] Yeah, absolutely.
[00:43:26.400 --> 00:43:28.080] Because you can't teach kids about that.
[00:43:28.320 --> 00:43:29.680] They're not ready for it.
[00:43:29.680 --> 00:43:36.880] Which means curious kids will go out and seek the information for themselves and find what they think are the answers in porn.
[00:43:37.920 --> 00:43:39.360] And there's data to support this.
[00:43:39.360 --> 00:43:44.560] So one pilot study from 2020 ran a porn literacy program with teenagers.
[00:43:44.560 --> 00:43:49.600] They taught those teenagers to spot the scripted and unrealistic stuff in porn.
[00:43:49.600 --> 00:43:53.360] They taught them to understand that real relationships don't look like this.
[00:43:53.360 --> 00:44:01.080] And they did that while framing it in a non-judgmental way and without saying you should be discouraged from watching porn.
[00:44:02.200 --> 00:44:14.280] And after taking part in that program, the teens were less likely to buy into harmful stereotypes that porn promotes and they were less likely to buy into the idea that certain behaviors are normal or expected.
[00:44:14.280 --> 00:44:28.600] So from the paper, quote, participants learned that not everyone enjoys being called names like slut during sex and they need to ask partners for consent before each new sexual act that they may want to try during a sexual encounter.
[00:44:28.600 --> 00:44:40.040] For example, anal sex, hair pulling, and spanking would each require separate consent and may not be as widely enjoyed by women as they might presume after having watched mainstream pornography.
[00:44:40.040 --> 00:44:41.320] Unquote.
[00:44:41.960 --> 00:44:56.040] What this paper found was that before the workshop, 44% of those attending said that watching porn made them want to emulate what they saw, and 26% of those attending said they thought porn was a realistic depiction of sex.
[00:44:56.360 --> 00:45:04.440] After the workshop, those numbers reduced to 29% who wanted to emulate what they had seen, down from 44%.
[00:45:04.440 --> 00:45:09.640] And the number who thought that porn was a realistic depiction of sex dropped to zero.
[00:45:09.960 --> 00:45:15.800] So after the workshop, none of them agreed was a realistic depiction of sex.
[00:45:16.440 --> 00:45:18.840] They also looked at adverse events.
[00:45:18.840 --> 00:45:31.640] So one of the adverse events they looked at was whether the teenagers sought to access more porn after the workshop, because that's a common objection to these kind of workshops: telling teenagers about porn will make them want to go and seek it out.
[00:45:31.640 --> 00:45:37.480] But in follow-up interviews, they found that the number who reported using porn did not change before and after the session.
[00:45:37.640 --> 00:45:38.520] Of course, it wouldn't change.
[00:45:38.520 --> 00:45:40.360] Teenagers want to see porn.
[00:45:40.600 --> 00:45:42.520] It doesn't matter whether you give them access or not.
[00:45:42.520 --> 00:45:42.920] They want to.
[00:45:42.920 --> 00:45:45.000] That's the whole fucking problem here, isn't it?
[00:45:45.520 --> 00:45:56.000] Nor did any of those involved report feeling anxious or upset by the content of the workshop, which is the other thing that, you know, maybe if you're a sexual abuse survivor or something like that, that might be a difficult thing for you to do.
[00:45:56.000 --> 00:45:59.520] None of those adverse events were reported in this paper.
[00:45:59.520 --> 00:46:04.800] Now, this paper has got very small numbers, 31 participants, no control group.
[00:46:04.800 --> 00:46:06.800] The follow-up was quite short-term.
[00:46:06.800 --> 00:46:09.040] Good reasons to be skeptical of these figures.
[00:46:09.040 --> 00:46:16.880] But my understanding, researching around this topic for the show, is that these findings are not untypical in this sort of research.
[00:46:16.880 --> 00:46:23.680] Many, many other papers find very similar things with porn literacy programs, is that they have these sorts of effects.
[00:46:24.320 --> 00:46:34.320] So while the Online Safety Act attempts to tackle the problem of children accessing harmful content, the tools and methods it relies on are flawed and prone to failure.
[00:46:34.320 --> 00:46:40.560] Age verification technology struggles with accuracy and with bias and can be easily circumvented.
[00:46:40.560 --> 00:46:52.960] The legislation's impact on smaller and often vulnerable communities and platforms risks further centralization of power in the hands of a few large companies, often at the cost of privacy and accessibility.
[00:46:52.960 --> 00:47:04.160] And more importantly, the focus on blocking access misses the bigger picture, which is children need better education to understand what they are seeing and to navigate their own sexuality safely.
[00:47:04.160 --> 00:47:14.880] The evidence, limited though it is, suggests that porn literacy programs reduce harmful misconceptions without causing the unintended harms of censorship and surveillance.
[00:47:15.520 --> 00:47:19.920] So in the end, I think the Online Safety Act is tackling the wrong problem.
[00:47:19.920 --> 00:47:29.760] The question isn't how to keep teenagers away from porn, it's how to prepare them to engage with the real world where sex and relationships are more complex than what they're going to see on the screen.
[00:47:30.120 --> 00:47:35.160] And any law that tries to simply slam the door shut is going to fall short.
[00:47:39.320 --> 00:47:40.920] I went to see Superman.
[00:47:40.920 --> 00:47:41.720] Oh, right, yes.
[00:47:41.720 --> 00:47:43.160] I went to see the new Superman film.
[00:47:43.160 --> 00:47:47.080] I've long since given up on DC films because they've all been shit for a long time.
[00:47:47.080 --> 00:47:50.040] They have, and I don't particularly like Superman as an idea.
[00:47:50.040 --> 00:47:51.160] I don't find it interesting.
[00:47:51.160 --> 00:47:53.560] He's just a man who does almost everything and is in Rumble.
[00:47:53.720 --> 00:47:54.040] Yeah.
[00:47:54.280 --> 00:48:04.600] And there's been a problem with Superman, and this problem has been since the Christopher Reed version of Superman, is that Superman is absurdly overpowered to the point where you've got two plots in Superman, right?
[00:48:04.600 --> 00:48:07.400] You've got Kryptonite and Bad Superman.
[00:48:07.400 --> 00:48:07.720] Yep.
[00:48:07.720 --> 00:48:09.720] And those are the only two plots that you've got.
[00:48:09.720 --> 00:48:11.240] You've got Kryptonite and Bad Superman.
[00:48:11.240 --> 00:48:13.960] So in the Chris Reeve film, Superman 1, Kryptonite.
[00:48:13.960 --> 00:48:15.720] Superman 2, Bad Superman.
[00:48:15.960 --> 00:48:17.720] Superman 3, Bad Superman.
[00:48:17.720 --> 00:48:19.560] Superman 4, Bad Superman.
[00:48:19.560 --> 00:48:20.680] That's the only plot you've got.
[00:48:21.480 --> 00:48:25.320] The Brandon Routh one with Kevin Spacey in it, that was Kryptonite.
[00:48:25.320 --> 00:48:25.800] Don't remember.
[00:48:26.680 --> 00:48:30.440] It's the Man of Steel, the Henry Capel one, Bad Superman.
[00:48:30.440 --> 00:48:30.760] Didn't watch it.
[00:48:31.160 --> 00:48:31.480] That's it.
[00:48:31.480 --> 00:48:32.200] You've got two plots.
[00:48:34.200 --> 00:48:35.320] Which one is this?
[00:48:35.320 --> 00:48:39.480] It turns out, I don't want to put spoilers in for Superman, but it turns out it's both.
[00:48:42.200 --> 00:48:46.200] So, Superman's a very overpowered character, and that's a problem.
[00:48:46.200 --> 00:48:52.120] And they recognized this in the comics, and they did a long-running comics plotline, which I think is Crisis on Infinite Earth.
[00:48:52.280 --> 00:48:59.800] I might be wrong, I'm not a DC comics guy, where they significantly nerfed Superman because they recognized he was too powerful and it made for boring plots.
[00:49:01.320 --> 00:49:05.400] This happened after the Chris Reeve film had been done.
[00:49:05.400 --> 00:49:14.960] And so, movie Superman is still the ridiculously overpowered Superman from the 70s, even though the comics have kind of backed him off because they realized it made for shit stories.
[00:49:14.440 --> 00:49:16.320] And that's carried through.
[00:49:16.400 --> 00:49:24.720] It carried through all the way through to the Henry Cavill Superman, who again is ridiculously overpowered, probably even more so than the Christopher Reeve one.
[00:49:24.720 --> 00:49:27.360] But for the new one, they've really backed it off.
[00:49:27.360 --> 00:49:27.920] Okay.
[00:49:27.920 --> 00:49:30.320] So he's much more vulnerable character.
[00:49:30.320 --> 00:49:34.640] The world is full of other superhero characters who give him a serious run for his money.
[00:49:34.640 --> 00:49:37.040] And it's a much more fun Superman as well.
[00:49:37.040 --> 00:49:43.920] Superman's always portrayed as very po-faced and very, you know, it's all very serious and it's very serious business being Superman.
[00:49:43.920 --> 00:49:45.440] And this is a fun Superman.
[00:49:45.440 --> 00:49:50.640] So after having not watched any of the DC, and I wasn't planning to watch this one until the reviews actually said this was quite good and crazy.
[00:49:50.800 --> 00:49:51.920] I saw the reviews thought it was decent.
[00:49:52.400 --> 00:49:56.080] So I went out and watched Nicholas Holt is really good as Lex Luther.
[00:49:56.080 --> 00:49:57.120] He's good in lots of things.
[00:49:57.120 --> 00:49:58.080] He's fantastic in it.
[00:49:58.080 --> 00:50:01.600] He was beast in the young cast of X-Men.
[00:50:01.600 --> 00:50:03.280] Yeah, and he was in skins, wasn't he?
[00:50:03.280 --> 00:50:03.600] Wasn't he?
[00:50:04.400 --> 00:50:06.880] I thought he could start in skins, I think.
[00:50:06.880 --> 00:50:08.480] But yeah, I really enjoyed Superman.
[00:50:08.480 --> 00:50:09.040] It was good.
[00:50:09.040 --> 00:50:14.960] And it was silly and fun in a way that Superman just hasn't been in the movies for a long time.
[00:50:14.960 --> 00:50:16.560] So I definitely recommend it.
[00:50:17.120 --> 00:50:18.000] It was worth going.
[00:50:18.000 --> 00:50:18.800] I went with Emma.
[00:50:18.800 --> 00:50:20.320] We went to see it at The Fact.
[00:50:20.480 --> 00:50:22.480] And there was no one else went to see it at The Fact.
[00:50:22.640 --> 00:50:23.760] That's often the case with facts.
[00:50:23.760 --> 00:50:24.240] I like facts.
[00:50:24.880 --> 00:50:30.960] When Emma booked the tickets, there was just one person that had already booked a seat and they booked a kind of middle-middle seat.
[00:50:31.600 --> 00:50:35.200] And so we did toy with the idea of booking the seat either side of them.
[00:50:35.360 --> 00:50:36.160] Toward it was good.
[00:50:37.520 --> 00:50:38.400] She fucked them off.
[00:50:38.560 --> 00:50:39.920] They wouldn't let you do that in one booking.
[00:50:39.920 --> 00:50:41.840] I think you'd have to do two bookings for that.
[00:50:42.560 --> 00:50:44.000] Don't really fuck this person off.
[00:50:44.400 --> 00:50:45.600] It'll be a laugh.
[00:50:45.600 --> 00:50:51.600] But for some reason, almost every advert, and I think this is a fact thing rather than a Superman thing.
[00:50:51.600 --> 00:50:53.760] So many adverts were for cars.
[00:50:53.800 --> 00:50:55.920] Yeah, well, they're sponsored by Kia, aren't they?
[00:50:56.080 --> 00:50:57.680] Pidger House is sponsored by Kia.
[00:50:57.680 --> 00:51:01.400] So there was the Kia one that was very close to the film, right?
[00:50:59.920 --> 00:51:07.080] So you had the generic ads, then trailers, then the Kia ad, and then the films.
[00:51:07.240 --> 00:51:08.280] Yes, yeah, yeah, that's what we have.
[00:51:09.160 --> 00:51:11.640] A lot of the generic ads were fucking cars as well.
[00:51:11.640 --> 00:51:24.760] It's like, well, when it used to be for films, it was all cars and perfume, and just maybe cars of car companies are the only ones with the money left for to be able to do cinema-based advertising these days, you know.
[00:51:24.760 --> 00:51:26.360] But yeah, it was so, but it was fun.
[00:51:26.360 --> 00:51:26.840] I enjoyed it.
[00:51:26.840 --> 00:51:29.960] It was, I would definitely recommend going to see going to see Superman.
[00:51:29.960 --> 00:51:33.320] Not as good as 28 years later, which I think might be my favorite film.
[00:51:33.320 --> 00:51:34.280] I've not seen it.
[00:51:35.080 --> 00:51:36.440] I could not get Nicol to watch it.
[00:51:36.440 --> 00:51:38.840] No, she's not interested at all.
[00:51:39.160 --> 00:51:41.400] Yeah, I will watch it at some point.
[00:51:41.400 --> 00:51:42.040] It's a good film.
[00:51:42.360 --> 00:51:43.000] We'll watch it with you.
[00:51:43.000 --> 00:51:43.960] I've not been to cinema.
[00:51:43.960 --> 00:51:47.800] I did watch the women's Euros, which has been very, very exciting.
[00:51:47.800 --> 00:51:48.680] Yes, I saw that.
[00:51:48.680 --> 00:51:50.040] Was lots of people talking about that.
[00:51:50.040 --> 00:51:51.320] Lots of people at work excited about that.
[00:51:51.480 --> 00:51:52.280] Yeah, it was excellent.
[00:51:52.280 --> 00:51:53.880] So England has won again.
[00:51:53.880 --> 00:51:55.160] So they've won twice.
[00:51:55.160 --> 00:51:56.840] They won it back-to-back, basically.
[00:51:56.840 --> 00:51:57.400] 2022.
[00:51:57.560 --> 00:51:58.440] Which does not happen.
[00:51:58.440 --> 00:52:01.880] Yeah, which should have been the 2021 tournament, but it was delayed by.
[00:52:01.880 --> 00:52:05.880] And then three years later, they won it this year, which was very good.
[00:52:05.880 --> 00:52:06.840] It was very exciting.
[00:52:06.840 --> 00:52:13.240] Lots of extra time, lots of kind of like tense kind of games, lots of going behind the quarterfinals and the semifinals.
[00:52:13.240 --> 00:52:20.040] Across the quarters, the semis, and the final, England were in the lead, cumulatively, for four and a half minutes.
[00:52:20.040 --> 00:52:20.520] Okay.
[00:52:20.520 --> 00:52:23.160] They played 360 minutes in those three games.
[00:52:23.160 --> 00:52:25.400] Cumulatively, they were ahead for four and a half minutes.
[00:52:25.640 --> 00:52:28.280] That's how kind of exciting and mad the whole thing was.
[00:52:28.280 --> 00:52:31.080] But the most, the maddest thing about it, I actually love the team.
[00:52:31.080 --> 00:52:32.680] I absolutely love so many of the players in it.
[00:52:32.680 --> 00:52:38.200] Lucy Bronze, the right back, is ridiculously, impressively nails.
[00:52:38.200 --> 00:52:41.320] She's hard as nails, like insanely sore, ridiculously sore.
[00:52:41.560 --> 00:52:45.520] Right down to the point where Tough is very literally her middle name.
[00:52:44.680 --> 00:52:48.800] She was born Lucia Roberta Tough Bronze.
[00:52:50.320 --> 00:52:54.960] But she went off with an injury in the second half and halftime in extra time.
[00:52:54.960 --> 00:53:03.200] So she played the 90 minutes and then 15 minutes and she kept going down with an injury to a leg and she's literally limping around while the ball is out of play.
[00:53:03.200 --> 00:53:04.880] And then when the ball's in place, she's running again.
[00:53:04.880 --> 00:53:05.920] You're like, what are you doing?
[00:53:05.920 --> 00:53:06.400] Just go off.
[00:53:06.400 --> 00:53:07.440] This is crazy.
[00:53:07.440 --> 00:53:13.440] And then in the interview at the end, they said to her, oh, and you had that injury to your right leg.
[00:53:13.440 --> 00:53:20.800] And you were, she's during the celebrations and everything, she's fully leg strapped up and she's hopping on her left leg because she can't bend her right leg anymore.
[00:53:20.800 --> 00:53:22.000] It's completely nagged.
[00:53:22.000 --> 00:53:24.640] And she said, they're saying, oh, so you're how's the how's the leg?
[00:53:24.640 --> 00:53:25.200] Is it all right?
[00:53:25.280 --> 00:53:27.520] She says, yeah, it's not, it's, it's bad.
[00:53:27.520 --> 00:53:33.120] But what I didn't say was throughout this entire tournament, I've been playing with a broken fibia in the other leg.
[00:53:33.120 --> 00:53:34.240] In the other leg?
[00:53:34.240 --> 00:53:36.560] So she's tibia, sorry.
[00:53:36.560 --> 00:53:40.800] So she's played the entirety of the tournament and won the tournament with a fractured tibia.
[00:53:40.800 --> 00:53:41.280] Fucking hell.
[00:53:42.160 --> 00:53:44.240] And they were like, wow, that must have been painful.
[00:53:44.640 --> 00:53:47.920] Yeah, it was incredibly painful constantly every single second of the game.
[00:53:48.320 --> 00:53:51.840] So she's now done a knee in the uninjured leg.
[00:53:51.840 --> 00:53:56.000] Presumably because she's been carrying a weight funny.
[00:53:56.000 --> 00:53:58.240] And when they said that, she was like, she laughed.
[00:53:58.240 --> 00:54:00.480] She was like, yeah, it's been insanely painful.
[00:54:00.720 --> 00:54:01.760] What is wrong with you?
[00:54:01.760 --> 00:54:02.880] You are not human.
[00:54:02.880 --> 00:54:03.280] Yeah.
[00:54:03.280 --> 00:54:04.640] Incredible.
[00:54:08.160 --> 00:54:15.920] So for QED, we've got the announcement of two new podcasts that are going to be appearing in the QED podcast room at QED.
[00:54:15.920 --> 00:54:19.360] The first of those is, of course, going to be Incredulous.
[00:54:19.360 --> 00:54:23.440] Incredulous has been on the QED main stage for the whole run of QED.
[00:54:23.440 --> 00:54:26.640] It has every year of QED so far, right up until the last one.
[00:54:27.600 --> 00:54:28.760] We're going to have an Incredulous.
[00:54:28.800 --> 00:54:30.280] Yep, which should be very, very fun.
[00:54:30.280 --> 00:54:31.160] It's always a good time.
[00:54:31.160 --> 00:54:32.120] It's always fun.
[00:54:32.120 --> 00:54:33.240] It's going to be a fantastic time.
[00:54:33.240 --> 00:54:34.360] We're looking forward to that.
[00:54:29.840 --> 00:54:36.520] And that, of course, is going to be hosted by Andy Wilson.
[00:54:36.760 --> 00:54:39.160] So you should definitely come along and watch that.
[00:54:39.160 --> 00:54:42.520] The other podcast that we've got is the No Rogan Experience.
[00:54:42.520 --> 00:54:47.560] Yes, so Cecil and I are going to do a live show of the No Rogan experience at QED.
[00:54:47.560 --> 00:54:50.680] So that is, for listeners who aren't aware, I don't mention it too often.
[00:54:50.680 --> 00:54:51.800] I did when I first launched it.
[00:54:51.800 --> 00:55:13.640] I do a show with Cecil Cicarello from Citation Needed and Cognitive Dissonance, where we watch episodes of Joe Rogan and explain where the errors in thinking are, where the conspiracy is being pushed, whether there's facts that are incorrect, and essentially what is happening on the biggest podcast in the world that's influencing huge, huge numbers of people.
[00:55:13.640 --> 00:55:14.600] What's actually going on?
[00:55:14.600 --> 00:55:20.760] And what should we know about it as people who are trying to promote a better way of addressing the world and thinking about stuff?
[00:55:20.760 --> 00:55:21.800] So we're going to do a live show.
[00:55:22.040 --> 00:55:35.720] We're thinking it's going to be a slightly different structure to a normal episode, but I think we can do something with the 45 to 50 minute space in the room that should be interesting and give a good feel for what we're doing and should be a good live experience.
[00:55:35.720 --> 00:55:39.240] So yeah, we're excited to do our first live show at QED.
[00:55:39.240 --> 00:55:40.600] That's very exciting.
[00:55:40.600 --> 00:55:58.680] So if you would like to watch the live podcast stream at QED, you can do so by going to QDCon.org and purchase an online ticket that is £49 and gets you access to the main stage content, the panel room content, and the live podcasts across the entire weekend.
[00:55:59.320 --> 00:56:01.920] Including this show live, including a live skeptics with a k included.
[00:56:02.040 --> 00:56:04.040] It's going to be a live version of this show as well.
[00:56:04.040 --> 00:56:07.800] And you can find more information about QED at QEDcon.org.
[00:56:08.120 --> 00:56:10.040] I should also plug our Patreon.
[00:56:10.040 --> 00:56:18.480] So, if you enjoy the show, if you enjoy what we do and you would like to support us, you can do that by going to patreon.com forward slash skeptics with a K, where you can donate for as little as a pound a month.
[00:56:18.800 --> 00:56:27.680] That money helps us, it helps with the running costs of the show, and also it gives you access to an ad-free version of this podcast.
[00:56:28.000 --> 00:56:30.640] Similarly, Merseyside Skeptics also have a Patreon.
[00:56:30.800 --> 00:56:33.360] You can access that at patrion.com forward slash Merseyskeptics.
[00:56:33.360 --> 00:56:40.560] And if you like what we're doing and you want to support the Merseyside Skeptics, you can do that by going to their Patreon, where you will also get an ad-free version of this show.
[00:56:40.560 --> 00:56:41.360] Yep.
[00:56:41.360 --> 00:56:43.840] Aside from that, then, I think that is all we have time for.
[00:56:43.840 --> 00:56:44.560] I think it is.
[00:56:44.560 --> 00:56:47.280] All that remains then is for me to thank Marsh for coming along today.
[00:56:47.280 --> 00:56:47.840] Cheers.
[00:56:47.840 --> 00:56:48.800] Thank you to Alice.
[00:56:48.800 --> 00:56:49.280] Thank you.
[00:56:49.280 --> 00:56:51.840] We have been Skeptics with a K, and we will see you next time.
[00:56:51.840 --> 00:56:52.560] Bye now.
[00:56:52.560 --> 00:56:53.600] Bye.
[00:56:58.400 --> 00:57:03.440] Skeptics with a K is produced by Skeptic Media in association with the Merseyside Skeptic Society.
[00:57:03.440 --> 00:57:12.720] For questions or comments, email podcast at skepticswithakay.org, and you can find out more about Merseyside Skeptics at mercyside skeptics.org.uk.