Debug Information
Processing Details
- VTT File: swak424.vtt
- Processing Time: September 11, 2025 at 03:07 PM
- Total Chunks: 1
- Transcript Length: 63,277 characters
- Caption Count: 594 captions
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 1 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.640 --> 00:00:06.800] This Labor Day at Value City Furniture, get up to 20% off your new living room and more throughout the store.
[00:00:06.800 --> 00:00:08.640] Think you deserve even more?
[00:00:08.640 --> 00:00:09.280] Same.
[00:00:09.280 --> 00:00:13.200] Like combining those savings with an extra 10% off door busters.
[00:00:13.200 --> 00:00:16.320] Plus, no interest financing for up to 40 months.
[00:00:16.320 --> 00:00:19.520] See if you pre-qualify without touching your credit score.
[00:00:19.520 --> 00:00:23.840] Yep, get more at Value City Furniture while paying less.
[00:00:23.840 --> 00:00:26.800] More style, more quality, more value.
[00:00:26.800 --> 00:00:29.280] We are Value City Furniture.
[00:00:36.960 --> 00:00:45.840] It is Thursday, the 24th of July, 2025, and you're listening to Skeptics with a K, the podcast for science, reason, and critical thinking.
[00:00:45.840 --> 00:00:56.720] Skeptics with a K is produced by Skeptic Media in association with the Merseyside Skeptic Society, a non-profit organization for the promotion of scientific skepticism on Merseyside around the UK and internationally.
[00:00:56.720 --> 00:00:58.080] I'm your host, Mike Hall.
[00:00:58.080 --> 00:00:59.120] With me today is Marsh.
[00:00:59.120 --> 00:00:59.600] Hello.
[00:00:59.600 --> 00:01:00.320] And Alice.
[00:01:00.320 --> 00:01:01.280] Hello.
[00:01:01.600 --> 00:01:09.520] So we've spoken on the show several times about the way large internet companies use the web to watch what you're doing.
[00:01:09.520 --> 00:01:21.920] Companies like Google and Facebook aggressively will take steps to monitor your online activities, see which websites you're looking at, how long you're looking at them for, how often you're looking at them.
[00:01:21.920 --> 00:01:31.520] And then they match these findings with personal information about you, your name, your address, your date of birth, et cetera, and aggregate all of this together and then they sell that to advertisers.
[00:01:31.520 --> 00:01:33.200] Is that something Amazon's doing as well?
[00:01:33.520 --> 00:01:34.400] Amazon does this as well.
[00:01:36.080 --> 00:01:39.920] And that advertising is not necessarily just for products and services.
[00:01:39.920 --> 00:02:01.640] I think it's bad enough if it was just for products and services, but it's also used for political advertising, which knowingly directs disinformation and misinformation at vulnerable people in order to sway their political opinions and targeting people who are likely to receive that message well and be influenced by it by using that information, those profiles that they've given up.
[00:01:59.680 --> 00:02:05.560] And that sways their political positions and therefore their vote.
[00:01:59.840 --> 00:02:06.040] Yep.
[00:02:07.240 --> 00:02:18.120] And much of this criticism has been leveled at Facebook, who have repeatedly and systematically worked to sidestep the controls that were put in place to protect individual privacy.
[00:02:18.120 --> 00:02:23.480] So they wiretapped their customers using a VPN to spy on their Snapchat and YouTube data.
[00:02:23.480 --> 00:02:32.120] They lied to users who gave them their phone number by saying this is just for you to log into your account, but then they used it for their advertising data anyway.
[00:02:32.440 --> 00:02:35.400] They installed trackers on the websites of several U.S.
[00:02:35.400 --> 00:02:41.400] hospitals, including prominent hospitals like Johns Hopkins, which they use to gather people's medical data.
[00:02:41.400 --> 00:02:45.080] And you can see those trackers sending back this is the ailment that they looked for.
[00:02:45.080 --> 00:02:46.360] They were looking for an appointment.
[00:02:46.360 --> 00:02:48.840] We can see that sort of information going back.
[00:02:48.840 --> 00:03:05.800] They also build so-called shadow profiles of non-Facebook users, where they gather information from people who use Facebook and from around the web in order to build a profile of people who aren't even on their platform in order to still be able to sell ads to those people.
[00:03:06.440 --> 00:03:13.000] And many of these things that they do are illegal, and Facebook is hit with scandal after scandal for these kind of privacy violations.
[00:03:13.000 --> 00:03:20.200] And each time they settle out of court and pay a fine, which is far smaller than the amount of money they made by committing the act in the first place.
[00:03:20.200 --> 00:03:32.760] Yeah, and then they go on Rogan and complain that they're being unfairly targeted by the EU who are just trying to crush them and how, yeah, specifically the American government should be easy on them in terms of regulation because it's patriotic to support an American company.
[00:03:32.760 --> 00:03:38.840] It's much like letting a bank robber settle with a thousand pound fine so they don't have to go to trial for their million pound heist.
[00:03:38.840 --> 00:03:41.160] You know, that's very much what Facebook do.
[00:03:41.160 --> 00:03:43.880] But that's not grounds that I'm looking to retread today.
[00:03:43.880 --> 00:03:51.200] In fact, I'm not even going to talk about Facebook's latest thing, which is using Facebook and Instagram to steal your private photos.
[00:03:51.200 --> 00:03:58.320] Which is to say, photos you didn't upload to Facebook or Instagram, they just take them off your phone and use them to train their AI.
[00:03:58.320 --> 00:04:01.520] So you have the app installed, it has permission to see your camera roll.
[00:04:01.520 --> 00:04:05.360] You didn't upload those pictures, but it'll take them and use them for their AI anyway.
[00:04:05.360 --> 00:04:10.320] I don't have the Facebook app installed, even if I rarely use Facebook, it's not installed anymore.
[00:04:10.320 --> 00:04:20.880] I'm not even going to talk about the news reports where Facebook's AI on WhatsApp gave the personal phone number of another WhatsApp user out in response to a query about railway timetables.
[00:04:20.880 --> 00:04:23.680] Somebody said, Where can I speak to this railway company?
[00:04:23.680 --> 00:04:29.280] And they sent him the number of a completely unrelated guy who was also a WhatsApp user.
[00:04:29.280 --> 00:04:29.760] Right.
[00:04:29.760 --> 00:04:30.800] I'm not going to talk about that either.
[00:04:31.040 --> 00:04:32.320] We can do a big deep dive on that.
[00:04:32.320 --> 00:04:33.680] That's not where we're going to go.
[00:04:33.680 --> 00:04:40.560] We have talked in the past about the persistent conspiracy theory that your mobile phone is listening to everything you say.
[00:04:40.880 --> 00:04:49.360] And companies like Facebook or Google or Apple or Amazon or others use these recordings to add into their advertising data.
[00:04:49.360 --> 00:04:55.840] And this is an incredibly pervasive idea to the point where people aren't even shocked at the suggestion.
[00:04:55.840 --> 00:04:57.120] It's kind of taken for granted.
[00:04:57.120 --> 00:04:58.560] Yeah, yeah, just assumes it's true.
[00:04:58.560 --> 00:04:59.680] This is true.
[00:04:59.680 --> 00:05:12.320] It's not uncommon at the moment to see videos on TikTok or YouTube or Instagram of people, it's frequently young women, deliberately whispering things to their boyfriend's phone to try and make it show him ads for that thing.
[00:05:12.320 --> 00:05:22.560] So you'll see women doing little skits or sketches for their TikTok where they're whispering, engagement ring, holiday in Marbella, to their boyfriend's phone.
[00:05:22.560 --> 00:05:26.960] Which is a gag, but a gag based on the shared apparent understanding that this is a real thing that happens.
[00:05:26.960 --> 00:05:32.600] It is just a little comedy bit, but for the bit to work, you have to accept that it's true, as you say.
[00:05:33.560 --> 00:05:38.920] And historically on this show, we have said that this sort of surveillance doesn't exist.
[00:05:38.920 --> 00:05:45.000] There is no widespread effort to covertly record everything you say and upload it to Facebook or Google or whatever.
[00:05:45.320 --> 00:05:48.280] But maybe I'm going to have to eat my words on that.
[00:05:48.280 --> 00:05:54.360] Maybe I'm going to have to chow down on some humble pie and then record myself doing it and upload it to Instagram.
[00:05:54.360 --> 00:06:00.600] Because a company called the Cox Media Group has claimed that they are doing exactly that.
[00:06:00.920 --> 00:06:14.600] So from their website, they say a smart device is likely within earshot when we talk about our plans for the weekend, how badly we need our kitchen remodeled, or debate which SUV is best for the family with our spouse.
[00:06:14.600 --> 00:06:16.360] None of them. Don't get an SUV.
[00:06:16.360 --> 00:06:18.280] You are destroying the planet.
[00:06:18.280 --> 00:06:26.680] When small businesses know who needs them, they can target ads with enhanced accuracy, waste less money, and grow their audience.
[00:06:26.680 --> 00:06:28.280] Creepy question mark?
[00:06:28.280 --> 00:06:29.000] Sure.
[00:06:29.000 --> 00:06:30.520] Great for marketing?
[00:06:30.520 --> 00:06:31.960] Definitely.
[00:06:32.280 --> 00:06:33.080] That's the pitch.
[00:06:33.240 --> 00:06:35.000] Okay, so they're selling me as a service.
[00:06:35.160 --> 00:06:37.560] They say, we do this, we're unashamed about it.
[00:06:37.560 --> 00:06:38.760] We will do this for you.
[00:06:38.760 --> 00:06:43.800] I feel like we just accept advertising as an excuse for everything.
[00:06:43.800 --> 00:06:44.120] Yeah.
[00:06:44.120 --> 00:06:51.880] That, like, we've never really challenged the fact that, like, we challenge that there's too much advertising, that advertising is an issue, that capitalism is an issue, blah, blah.
[00:06:52.120 --> 00:06:56.360] But we haven't challenged the existence of advertising, full stop.
[00:06:56.360 --> 00:06:59.480] We've just accepted that as the start point.
[00:06:59.480 --> 00:07:02.520] And then everything else we complain about is the stuff beyond that.
[00:07:02.520 --> 00:07:05.320] And the intrusions of adverts, obviously, I talked about this on the last show.
[00:07:05.320 --> 00:07:07.480] The intrusion of adverts bother me so much.
[00:07:07.480 --> 00:07:17.520] And the thing that amazes me is how adverts of a different genre bypass any of your flags of, oh, this is a bad thing, or here are the rules that we have.
[00:07:17.520 --> 00:07:21.920] What struck me was Gary Lineker has a podcast, a very big podcast empire, all about football.
[00:07:14.760 --> 00:07:24.160] He and Alan Shearer will be on there.
[00:07:24.160 --> 00:07:28.000] Because it's a podcast, they'll get podcast advertising where they'll do host-read ads.
[00:07:28.000 --> 00:07:34.160] And Gary Lineker and Alan Shearer will read out the ads for fucking whatever meal preparation program.
[00:07:34.160 --> 00:07:38.560] And it strikes me that Alan Shearer used to do ads for McDonald's and Gillette.
[00:07:38.560 --> 00:07:44.000] And if you wanted Alan Shearer in your ad, I imagine it would cost you hundreds of thousands of pounds.
[00:07:44.000 --> 00:07:45.760] He was one of the most famous footballers in the world at the time.
[00:07:46.000 --> 00:07:50.240] It's six or seven figures to have Alan Shearer or indeed Gary Lineker do your ads.
[00:07:50.480 --> 00:07:50.800] Exactly.
[00:07:50.800 --> 00:07:53.840] I mean, Nish Kumar does ad reads on Pod Save the UK.
[00:07:53.840 --> 00:08:02.240] To get Nish Kumar in an ad is going to be thousands and thousands of pounds, but hand him copy, hand these guys copy because they happen to be sat with a microphone.
[00:08:02.240 --> 00:08:05.440] And the convention of podcast advertising is, well, the hosts read out the ads.
[00:08:05.680 --> 00:08:06.640] And so we'll just do that.
[00:08:06.800 --> 00:08:11.360] We won't negotiate an extra price because of how big a celebrity the person is.
[00:08:11.360 --> 00:08:17.520] It's wild to me that the advertising world has shifted in that kind of way and nobody's noticed it.
[00:08:17.520 --> 00:08:19.520] But we don't question anything when it comes to advertising.
[00:08:19.520 --> 00:08:21.600] Advertising is just an excuse for anything.
[00:08:21.600 --> 00:08:25.440] Like, well, like you say, oh, well, is it creepy?
[00:08:25.440 --> 00:08:27.280] Yes, but it's good for marketing.
[00:08:27.360 --> 00:08:29.440] It's great for marketing, so it's good for a time.
[00:08:29.520 --> 00:08:32.000] There's a trend at the moment that's been ongoing for a while.
[00:08:32.000 --> 00:08:34.720] I had a whine about this on Blue Sky recently.
[00:08:34.720 --> 00:08:37.280] It's a model that we refer to as consent or pay.
[00:08:37.280 --> 00:08:38.000] Oh, God, I hate it.
[00:08:38.800 --> 00:08:39.920] I hate it so much.
[00:08:39.920 --> 00:08:45.120] A lot of newspapers are doing this now where it's like you can pay for a subscription or you can have the ads.
[00:08:45.120 --> 00:08:46.240] And, you know, it's up to you.
[00:08:46.240 --> 00:08:47.120] It's up to you which one.
[00:08:47.120 --> 00:08:47.680] It's your choice.
[00:08:47.680 --> 00:08:50.000] The Independent's got a thing saying it's your choice.
[00:08:50.000 --> 00:08:52.000] You can choose whichever one you like.
[00:08:52.000 --> 00:08:56.960] And fundamentally, what that comes down to is rich people get to preserve their privacy and poor people don't.
[00:08:57.520 --> 00:08:58.800] That's fundamentally what it comes down to.
[00:08:58.960 --> 00:09:00.680] I would happily accept the ads.
[00:09:00.680 --> 00:09:01.880] I don't want the tracking.
[00:08:59.680 --> 00:09:03.400] That's the problem, that's the issue.
[00:08:59.840 --> 00:09:04.280] Take those away.
[00:09:04.440 --> 00:09:08.200] Just serve me an ad and I will have that next to the article and I'll happily read that.
[00:09:08.200 --> 00:09:14.120] That's the way ads have always worked in the advertising industry, in the newspaper industry, the entirety of the time.
[00:09:14.120 --> 00:09:20.520] But now it has to be, I need to know everything about you so I can send you a picture of things you're still not going to fucking buy, but I get to invade your privacy.
[00:09:20.520 --> 00:09:24.440] And in the last 20 years, ads and tracking have become synonymous.
[00:09:24.440 --> 00:09:28.680] The advertising industry can't seem to cope with the idea of an ad without tracking.
[00:09:29.000 --> 00:09:34.840] Whereas if you go to the 90s, 80s, 70s, no one was tracking whether you watch the ads on fucking Coronation Street or not.
[00:09:35.160 --> 00:09:38.040] No, you just put ads for Coronation Street watchers on Coronation Street.
[00:09:38.280 --> 00:09:38.920] Coronation Street.
[00:09:39.080 --> 00:09:39.800] Yeah, absolutely.
[00:09:39.800 --> 00:09:43.240] No one was tracking which billboard that you were looking at as you drove past it, that sort of thing.
[00:09:43.640 --> 00:09:49.000] And you do a focus group instead of like tracking a load of people and looking at those analytic data.
[00:09:49.000 --> 00:09:53.800] But now it's ads and tracking are synonymous and the industry can't imagine one without the other.
[00:09:53.800 --> 00:09:59.720] To the point where Cox Media Group, as you've said there, Alice, openly acknowledge that their product is creepy.
[00:09:59.720 --> 00:10:04.920] In fact, elsewhere on their website, they compare their product to an episode of Black Mirror.
[00:10:05.320 --> 00:10:08.360] They say, I know it sounds like something out of Black Mirror, but it's here today.
[00:10:08.360 --> 00:10:10.200] The thing is, that's also marketing.
[00:10:10.200 --> 00:10:11.080] That's their U.S.
[00:10:11.240 --> 00:10:12.200] Outrage marketing, yeah.
[00:10:12.200 --> 00:10:12.760] Yeah, exactly.
[00:10:12.760 --> 00:10:14.200] That's their USP.
[00:10:14.200 --> 00:10:20.200] It's that, oh, we will own the fact that this is so outrageous and try and get you to pass this around for that reason.
[00:10:20.200 --> 00:10:28.040] So CMG, Cox Media Group, their pitch listed Facebook, Amazon, and Google as technology partners for this product.
[00:10:28.040 --> 00:10:30.280] And they called this product active listening.
[00:10:31.240 --> 00:10:32.920] That was the product that they're pitching.
[00:10:32.920 --> 00:10:34.680] That's bad as well.
[00:10:34.840 --> 00:10:35.880] Well, yeah, because it's true.
[00:10:35.880 --> 00:10:38.680] Active listening is a therapy term.
[00:10:38.680 --> 00:10:39.480] Oh, I see.
[00:10:39.480 --> 00:10:41.560] Or a coaching term, whatever you want to call it.
[00:10:41.640 --> 00:10:45.840] Yeah, it's when you don't just wait for your turn to speak in a conversation.
[00:10:45.840 --> 00:10:49.520] You're sitting there and listening and engaging with and trying to take on board.
[00:10:44.920 --> 00:10:52.080] Yeah, but it's remarkable how many people don't.
[00:10:52.640 --> 00:10:55.360] You don't have to change the name of it, just get people to do it.
[00:10:55.680 --> 00:11:03.600] So the technology website 404 Media last year published a pitch deck which was used by CMG to advertise their active listening product.
[00:11:03.760 --> 00:11:07.760] Somebody leaked their sales pitch deck to 404.
[00:11:07.760 --> 00:11:16.400] And the pitch deck included claims that they use AI to gather data from hundreds of sources, including Facebook, Amazon, Google, and LinkedIn.
[00:11:16.400 --> 00:11:24.080] And they aggregate that together with behavioral data and voice data to target consumers who are in the market for your product.
[00:11:24.080 --> 00:11:27.120] Okay, that's a different claim to what they said.
[00:11:27.360 --> 00:11:30.720] That sounds like a different claim from what they were saying at the start.
[00:11:30.720 --> 00:11:33.280] Where they're saying that it's all about the voice data.
[00:11:33.280 --> 00:11:34.000] It's active listening.
[00:11:34.400 --> 00:11:38.800] Because even voice data, to say it's voice data, isn't quite the same as constant surveillance.
[00:11:38.800 --> 00:11:39.120] Yeah.
[00:11:39.760 --> 00:11:42.400] They also appear to be hyper-localized.
[00:11:42.560 --> 00:11:45.360] The pitch deck mentions two packages that they sell.
[00:11:45.360 --> 00:11:52.720] One targets consumers who are within 10 miles of your business, and the other targets consumers within 20 miles of your business.
[00:11:52.720 --> 00:11:54.960] Yeah, okay, that's understandable how they could do that.
[00:11:54.960 --> 00:12:02.560] It says, this is a quote from the pitch deck: smart devices capture real-time intent data by listening to our conversations.
[00:12:02.560 --> 00:12:08.320] Advertisers pair this voice data with behavioral data to target in-market consumers.
[00:12:08.320 --> 00:12:13.040] In-market consumers, presumably meaning consumers who are in the market for your product.
[00:12:13.360 --> 00:12:20.160] So somebody's walking nearby and says, oh, I could just have an ice cream, and you'd suddenly get a notification to the ice cream shop that's around the corner.
[00:12:20.160 --> 00:12:21.920] Yeah, that sort of thing.
[00:12:22.240 --> 00:12:25.200] So there's little ambiguity in this.
[00:12:25.200 --> 00:12:33.960] CMG appears to be confirming this long-standing conspiracy theory, and all those women on TikTok will be getting their engagement rings soon, presumably.
[00:12:34.520 --> 00:12:41.640] But as long as it's within like 10 blocks of a jeweler who has bought the active listening products.
[00:12:41.640 --> 00:12:43.480] Yeah, they're going to get a really shit engagement ring.
[00:12:43.800 --> 00:12:45.960] A really cheap engagement ring.
[00:12:45.960 --> 00:12:47.640] So I'm still skeptical on this.
[00:12:47.640 --> 00:12:48.200] Yeah.
[00:12:48.200 --> 00:12:56.760] When CMG's claims for their active listening products first came to light, technology firms went out of their way to distance themselves from CMG.
[00:12:56.760 --> 00:13:01.400] Google said, we used to work with CMG, but we've removed them from our partner program.
[00:13:01.400 --> 00:13:05.960] Amazon flat-out denied that CMG ever had anything to do with them.
[00:13:05.960 --> 00:13:12.600] And Facebook, of course, did neither of these things and just said that their relationship with CMG is currently under review.
[00:13:12.600 --> 00:13:14.280] So fully non-committal.
[00:13:14.280 --> 00:13:16.680] Oh, they're just going to wait for this to blow over and see what happens.
[00:13:16.920 --> 00:13:18.440] I think Amazon have just lied, though.
[00:13:18.440 --> 00:13:22.120] So at least Facebook have said nothing, but Amazon have lied.
[00:13:22.120 --> 00:13:26.200] But also, like, this is how you get the creep into things like that.
[00:13:26.200 --> 00:13:31.240] I don't want to sound like a conspiracy theorist here, but you throw somebody under the bus and go, oh, people are going to be outraged.
[00:13:31.240 --> 00:13:36.120] And now, the next time we mention it, people stop noticing that it's outrageous because they've heard it before.
[00:13:36.440 --> 00:13:40.120] And now we can creep this stuff through because it actually would be useful.
[00:13:40.440 --> 00:13:48.520] So the incident attracted specific statements from Apple and Google denying that what CMG was claiming was even possible.
[00:13:48.520 --> 00:13:54.920] So Google told Variety, Android prevents apps from collecting audio data when they're not being used.
[00:13:55.480 --> 00:14:02.040] And Apple similarly said, no app can access the iPhone's microphone or camera without your permission.
[00:14:02.040 --> 00:14:02.600] Yes.
[00:14:02.600 --> 00:14:10.520] Apple then went further and said any data collected for Siri is not used to build a marketing profile and we never sell it to anybody.
[00:14:10.520 --> 00:14:22.800] That thing of no app can collect microphone data without your permission is a fucking cop-out because how many apps, A, how many apps that don't need your microphone ask for microphone permissions just because they ask for fucking everything?
[00:14:22.800 --> 00:14:26.960] And B, how many apps do you give permission to use the microphone?
[00:14:26.960 --> 00:14:40.160] Because you're assuming that they'll use it for when you're using the app and when you're specifically recording microphone or using a call or whatever you're using, it would reasonably use the microphone for in the app.
[00:14:40.160 --> 00:15:00.240] And that's my assumption or my guess as to where this is going is that the apps that you've given access to the microphone for, for example, you're putting videos up on Instagram, that gives you access to your voice data that could then be analyzed in any kind of way and then lined up with who you are from the other data that's available.
[00:15:00.240 --> 00:15:06.320] And so they can say, oh, we take all this data and voice data, but it's the voice data from the stuff you've willingly given them.
[00:15:06.320 --> 00:15:25.200] But, you know, it's not like you can just remove microphone privileges from apps because A, you cannot use a mobile phone without allowing some app to use your microphone because it's a fucking phone, but also there are a lot of apps that don't need your microphone that you still can only use if you give microphone privileges to, right?
[00:15:25.200 --> 00:15:26.880] Yeah, delete those ones.
[00:15:27.520 --> 00:15:28.320] So what's going on?
[00:15:28.320 --> 00:15:28.960] What's the story?
[00:15:29.280 --> 00:15:31.680] How are CMG saying we have this product?
[00:15:31.680 --> 00:15:39.840] But Apple and Google, whose software powers almost every phone on the planet, both say what CMG are claiming is not possible.
[00:15:40.480 --> 00:15:44.400] So most smartphones these days come with voice assistant software.
[00:15:44.400 --> 00:15:50.960] And that software is increasingly incorporating AI to assist in the comprehension of the user's request.
[00:15:50.960 --> 00:15:53.920] It's not just smartphones that do this either.
[00:15:53.920 --> 00:15:56.720] So, some televisions have got voice mode now.
[00:15:56.720 --> 00:16:00.600] There's obviously Amazon's line of Echo products, the Alexa products.
[00:15:59.600 --> 00:16:02.760] Never have it in my house.
[00:16:02.920 --> 00:16:08.040] Apple has got the HomePod, which is a similar kind of smart speaker product and so on.
[00:16:08.040 --> 00:16:10.200] But they all work in a very similar way.
[00:16:10.200 --> 00:16:13.800] This isn't related to audio, like to microphone stuff at all.
[00:16:13.800 --> 00:16:25.240] But I was writing an email on my laptop the other day, and I never use any of the AI services unless I accidentally click on it when I'm trying to click something else and then immediately close it.
[00:16:25.240 --> 00:16:26.520] Like, I never use any of that.
[00:16:26.520 --> 00:16:34.040] But I was working on this email, and I was on a call at the same time, so I kept pausing to listen to the call and then carrying on with my email.
[00:16:34.040 --> 00:16:37.000] So it was taking me a while to write this email because I was doing it in short bursts.
[00:16:37.000 --> 00:16:43.080] And after a couple of minutes of this, I got a pop-up on the side of my computer that said, Do you need help writing this email?
[00:16:43.080 --> 00:16:47.000] I was like, I do not have, I don't use AI on my laptop.
[00:16:47.000 --> 00:16:50.520] I'm using a web browser email service.
[00:16:50.760 --> 00:16:55.960] Like, you should not be looking at that and seeing that it's taking me a while to write an email and be offering me help.
[00:16:56.040 --> 00:16:57.560] And then trying to quote a group help.
[00:16:57.960 --> 00:16:59.320] This is ridiculous.
[00:16:59.480 --> 00:17:00.520] It's so infuriating.
[00:17:00.520 --> 00:17:10.360] I mean, so with the math projects that I work on through Good Thinking, the Parallel Academy stuff, we have quite a few Google subscriptions because we have to have a Gmail inbox and stuff.
[00:17:10.360 --> 00:17:19.000] And recently we got an email from Google saying, We're putting up the cost of your prescription, your subscription, because each month you're now getting access to powerful AI tools.
[00:17:19.320 --> 00:17:20.760] Well, I don't want those tools.
[00:17:20.760 --> 00:17:24.520] You're just charging me for now for stuff that I don't want that I don't think should exist.
[00:17:24.520 --> 00:17:25.160] Yeah.
[00:17:25.160 --> 00:17:39.880] And Google recently announced that I was going to integrate this with the story and didn't, but Google recently announced that their Gemini AI is going to be enabled by default on your Android phone and will have access to your WhatsApp messages when it does that.
[00:17:39.880 --> 00:17:41.400] Jesus Christ.
[00:17:41.400 --> 00:17:42.280] So, yeah.
[00:17:42.600 --> 00:17:46.880] Anyway, these voice assistant services all work in roughly the same way.
[00:17:47.040 --> 00:17:53.120] So these devices are always listening, they are always monitoring, and they are listening for a specific phrase.
[00:17:53.120 --> 00:17:53.440] Yes.
[00:17:53.600 --> 00:17:59.680] And this phrase you might refer to as the wake word, which tells the device you have a query or an instruction for it.
[00:17:59.680 --> 00:18:02.640] So on Apple devices, that's Siri or Hey Siri.
[00:18:02.640 --> 00:18:04.480] On Amazon Echo, it's Alexa.
[00:18:04.480 --> 00:18:06.240] For Google, it's OK Google.
[00:18:06.240 --> 00:18:08.720] Everyone's phone is just lit up listening to this now.
[00:18:09.360 --> 00:18:24.960] And when the device hears the wake word, it records a short section of audio and sends that over the internet to a machine which will interpret what you have said, returns that interpretation to your device, and your device can then act on the text
[00:18:24.960 --> 00:18:32.000] of the request, saying, oh, you asked what time it is, or to set a reminder, or to read notifications, or whatever.
[00:18:33.280 --> 00:18:40.000] But it would be misleading to say that your device is always listening in any meaningful sense.
[00:18:40.000 --> 00:18:44.160] Because for one thing, the devices are only listening for the trigger word, the wake word.
[00:18:44.160 --> 00:18:46.240] They aren't recording everything you say.
[00:18:46.240 --> 00:18:49.760] They're looking for a specific acoustic pattern.
[00:18:49.760 --> 00:19:38.040] And only when they detect that pattern will they... It's like characterizing your smoke detector as always sniffing. Right, yeah. It kind of isn't, though; that's not a reasonable characterization of what it's doing. And all of that takes place on the device. Listening for the wake word is something that the device does on its own, and only when it hears it does it start recording and then send that recording off-device for processing. Yeah. And the thing is, it's like when we talked about this the last time we touched on this topic: we know that the phone isn't constantly sending everything to the server, because of the battery power that would take. Your battery would not last if it's not just recording everything but also transmitting it. And you'd notice it as you're walking around: if you had any kind of limited data plan, you would notice that your data was constantly being used.
[00:19:38.040 --> 00:19:39.960] So it can't possibly be happening.
[00:19:39.960 --> 00:19:42.360] And we will come back to that point in a minute.
[00:19:42.360 --> 00:19:43.000] Okay.
[00:19:43.000 --> 00:19:46.760] So your phone isn't listening in the sense of having any comprehension of the words you're saying.
[00:19:46.760 --> 00:19:49.160] It's just pattern matching for the trigger phrase.
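To make the shape of that on-device flow concrete, here is a minimal Python sketch of a wake-word gate. Everything in it (the pattern, the quantise step, the clip length) is a made-up stand-in for a real acoustic model; it is not how Apple, Google, or Amazon actually implement this, just an illustration of audio being checked locally, with only a short clip handed off after a match.

```python
# Minimal sketch of a wake-word gate (toy stand-in, not any vendor's code):
# audio is examined locally, frame by frame, and nothing is retained or sent
# until a short trigger pattern is matched.
from collections import deque

WAKE_PATTERN = [3, 1, 4]           # hypothetical acoustic "fingerprint"
CLIP_FRAMES_AFTER_TRIGGER = 5      # how much audio to capture once triggered


def quantise(frame: float) -> int:
    """Reduce a raw audio frame to a coarse feature (stand-in for the
    on-device acoustic model)."""
    return int(frame) % 10


def wake_word_gate(audio_stream):
    """Yield only the short clips that follow a wake-word match; every other
    frame is inspected and immediately discarded on-device."""
    recent = deque(maxlen=len(WAKE_PATTERN))
    stream = iter(audio_stream)
    for frame in stream:
        recent.append(quantise(frame))
        if list(recent) == WAKE_PATTERN:
            # Trigger: capture a short clip and hand it off for server-side
            # interpretation (simulated here by yielding it).
            clip = [next(stream, None) for _ in range(CLIP_FRAMES_AFTER_TRIGGER)]
            yield [f for f in clip if f is not None]


if __name__ == "__main__":
    fake_audio = [7.0, 13.0, 21.0, 14.0, 9.9, 1.2, 2.3, 3.4, 4.5, 5.6]
    for clip in wake_word_gate(fake_audio):
        print("clip sent for processing:", clip)
```

The point of the sketch is that the loop inspects every frame but keeps nothing; only the few frames after a match ever leave the function.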
[00:19:49.160 --> 00:19:59.000] Now, it's a fair criticism to say, yeah, Mike, that's fine, but you only say that's what's happening because that's what they've told you is what's happening and they could be lying about it.
[00:19:59.000 --> 00:19:59.800] And maybe that's true.
[00:19:59.800 --> 00:20:02.760] Maybe it doesn't work like this at all and they're covering it up.
[00:20:02.760 --> 00:20:04.520] So how would we know?
[00:20:04.840 --> 00:20:18.440] So back in 2019, a group of researchers working at a cybersecurity firm called Wandera conducted a controlled experiment to see if they could find evidence of apps on your phone covertly recording conversations.
[00:20:18.760 --> 00:20:22.840] Now, before I describe the experiment that they did, this was not peer-reviewed.
[00:20:22.840 --> 00:20:25.000] It was only published on their website.
[00:20:25.000 --> 00:20:29.000] So with that caveat, use as much salt as you need in order to do this.
[00:20:29.000 --> 00:20:29.560] Personally, I'm not sure.
[00:20:29.800 --> 00:20:37.080] This is industry stuff rather than cybersecurity stuff.
[00:20:38.360 --> 00:20:41.880] So they took two smartphones, an Apple device and a Samsung device.
[00:20:41.880 --> 00:20:44.360] And the Samsung device was running Android.
[00:20:44.360 --> 00:20:47.560] And they put those phones in a room for three days.
[00:20:47.560 --> 00:20:48.200] Okay.
[00:20:48.200 --> 00:20:56.120] Both those devices had Facebook installed, Instagram installed, Chrome installed, Snapchat, YouTube, and Amazon.
[00:20:56.120 --> 00:20:57.240] This is pre-TikTok.
[00:20:58.200 --> 00:21:00.120] So this was before TikTok was huge.
[00:21:00.120 --> 00:21:00.760] Yeah.
[00:21:00.760 --> 00:21:04.920] Those apps were given full permission to access the microphone on the device.
[00:21:04.920 --> 00:21:07.880] So no restriction, no, only when the app is running.
[00:21:08.120 --> 00:21:10.680] Full permission on the microphone.
[00:21:11.000 --> 00:21:17.600] Every day for three days, they played a pet food ad into the room for half an hour.
[00:21:14.760 --> 00:21:18.960] And then it would stop.
[00:21:19.280 --> 00:21:25.200] As a control, there was also a quiet room where the devices were left in silence for three days.
[00:21:26.720 --> 00:21:30.560] In both cases, the devices were left switched on and locked.
[00:21:30.560 --> 00:21:32.480] So on the lock screen, switched on.
[00:21:32.480 --> 00:21:33.680] As you carry them around, yeah.
[00:21:33.680 --> 00:21:35.840] And who had the keys to the room?
[00:21:35.840 --> 00:21:39.040] That wasn't in the description that they put out there.
[00:21:39.040 --> 00:21:48.160] I can imagine a world where someone goes, oh, but like you're having a workplace romance and that room's locked and just quiet for three days.
[00:21:48.560 --> 00:21:52.160] You know, we can go in there and now you're talking to the phones.
[00:21:52.480 --> 00:21:53.200] So after the experiment.
[00:21:54.000 --> 00:21:55.760] Talking is what you're doing in there.
[00:21:56.400 --> 00:21:57.120] After the experiment.
[00:21:57.200 --> 00:21:59.920] Why is it bringing a batch for Squelch?
[00:21:59.920 --> 00:22:02.400] How is Squelch a product?
[00:22:03.360 --> 00:22:05.440] After the experiment, they measured three things.
[00:22:05.440 --> 00:22:09.440] One, how much battery was used in the pet food room versus the quiet room.
[00:22:10.000 --> 00:22:13.920] Two, how much data was used in the pet food room versus the quiet room.
[00:22:13.920 --> 00:22:17.840] And three, did any ads for pet food show up on the devices?
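As a rough illustration of that comparison, a sketch like the following shows how the three measurements line up per room. The readings are entirely hypothetical placeholders, not the study's actual numbers.

```python
# Hypothetical readings purely to illustrate the room-versus-room comparison;
# these are not Wandera's actual figures.
readings = {
    "pet_food_room": {"battery_drain_pct": 4.1, "data_used_mb": 0.8, "pet_food_ads_seen": 0},
    "quiet_room":    {"battery_drain_pct": 4.0, "data_used_mb": 0.7, "pet_food_ads_seen": 0},
}

for metric in ("battery_drain_pct", "data_used_mb", "pet_food_ads_seen"):
    diff = readings["pet_food_room"][metric] - readings["quiet_room"][metric]
    print(f"{metric}: pet food room minus quiet room = {diff:+.1f}")
```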
[00:22:18.800 --> 00:22:26.320] And what they found was that for both Android and Apple devices, no fucking difference in data use between the two rooms at all.
[00:22:27.040 --> 00:22:34.480] The data that was used was minimal, less than a megabyte, and in most cases, significantly less than a megabyte.
[00:22:34.480 --> 00:22:42.080] They had a control, which was actually saying, hey, Siri, or okay, Google to these things to see how much... that's my phone just lit up when I've said that.
[00:22:42.480 --> 00:22:46.080] Or saying okay, Google to one of these things to see how much data that used.
[00:22:46.080 --> 00:22:46.800] And it was huge.
[00:22:46.800 --> 00:22:49.600] It was like 30 megabytes, 100 megabytes.
[00:22:49.600 --> 00:23:02.280] So, that's a slightly unfair characterization because what they did was assume that a recording of the full 30 minutes would use the same amount of data per second as a triggered wake word.
[00:22:59.840 --> 00:23:06.600] And so they extrapolated how much data you would expect it to use over that time.
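A back-of-the-envelope version of that extrapolation, with assumed per-trigger figures since the exact numbers aren't quoted here, looks something like this:

```python
# Assumed figures purely for illustration: a single wake-word upload treated
# as ~5 seconds of audio costing ~0.3 MB. Scaling that rate up to a continuous
# 30-minute recording gives a figure on the order of 100 MB, versus the
# less-than-a-megabyte actually observed.
per_trigger_seconds = 5
per_trigger_mb = 0.3
mb_per_second = per_trigger_mb / per_trigger_seconds   # 0.06 MB/s

thirty_minutes_mb = mb_per_second * 30 * 60             # ~108 MB
print(f"Expected upload for 30 minutes of audio: ~{thirty_minutes_mb:.0f} MB")
```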
[00:23:06.920 --> 00:23:12.040] But yeah, there was minimal data use and no difference between the quiet room and the pet food room.
[00:23:12.040 --> 00:23:13.320] So that's a pretty good test.
[00:23:13.320 --> 00:23:20.840] They measured the data use of the phones by connecting them to their Wi-Fi so they could see whether the phones were making requests out.
[00:23:20.840 --> 00:23:27.720] But of course, maybe the phones are smart about it and they're actually uploading the data using the 5G connection or something like that.
[00:23:27.880 --> 00:23:30.680] Say, oh, when we're spying on people, don't use their Wi-Fi.
[00:23:30.680 --> 00:23:31.560] They'll rumble.
[00:23:31.560 --> 00:23:33.160] Use the cellular modem instead.
[00:23:33.160 --> 00:23:36.520] I think you'd be more likely to notice that if you had any kind of limited data plan.
[00:23:36.520 --> 00:23:41.480] But also, there was no difference in battery use between the silent room and the pet food room.
[00:23:41.480 --> 00:23:46.200] If they were uploading masses of audio data using the cellular modem, that's going to drain the battery.
[00:23:46.200 --> 00:23:47.080] Were they brand new phones?
[00:23:47.080 --> 00:23:47.800] Sorry, did you say that?
[00:23:47.800 --> 00:23:48.040] Sorry?
[00:23:48.120 --> 00:23:48.840] They were brand new phones.
[00:23:48.840 --> 00:23:50.440] Yeah, they were new phones.
[00:23:50.440 --> 00:23:53.160] So actually, there was no difference between the rooms at all.
[00:23:53.160 --> 00:24:01.480] And neither of the devices started showing ads for pet food after the test, either in apps or on the web.
[00:24:01.800 --> 00:24:03.560] Now, as I say, that's not peer-reviewed.
[00:24:03.560 --> 00:24:05.480] It's very much a back-of-the-envelope experiment.
[00:24:05.480 --> 00:24:11.400] It's the kind of thing we could probably put together at MSS; it's the kind of test that we might do, like we did with the Shuzi band.
[00:24:11.400 --> 00:24:12.840] Yeah, we just couldn't afford the phones.
[00:24:13.480 --> 00:24:16.920] But nevertheless, I found the findings compelling on this.
[00:24:16.920 --> 00:24:26.840] They found no evidence of any of those apps covertly recording using the microphone, even when the microphone was granted full permission on those apps.
[00:24:27.160 --> 00:24:29.480] Now, admittedly, this is from 2019.
[00:24:29.480 --> 00:24:30.920] This is pre-TikTok.
[00:24:30.920 --> 00:24:35.240] CMG have only started promoting their active listening products over the last couple of years.
[00:24:35.480 --> 00:24:41.400] It's possible that Facebook have become worse, or Google have changed their practices, or Amazon have started doing stuff.
[00:24:41.800 --> 00:24:43.320] So maybe things have changed.
[00:24:43.320 --> 00:24:46.160] Maybe they have found a way to do this these days.
[00:24:46.160 --> 00:24:53.600] Now, one possible way you could actually do this is by exploiting the way the pattern matching for the wake word works.
[00:24:53.600 --> 00:25:03.040] So sometimes you say something that isn't the wake word, like, oh, hey, seriously, or okay, cool, or something like that.
[00:25:03.040 --> 00:25:05.040] That the device.
[00:25:05.040 --> 00:25:05.840] Yes.
[00:25:05.840 --> 00:25:14.160] That the device mistakes for the wake word and then will record a section of conversation and upload that to the cloud.
[00:25:14.160 --> 00:25:20.800] It's also possible that when you're making a legitimate request to one of these tools, someone else is having a conversation at the same time in the background.
[00:25:21.120 --> 00:25:24.800] And that then gets recorded and uploaded and ingested into these systems.
[00:25:24.800 --> 00:25:34.000] So there are plausible mechanisms where unexpected segments of your conversation could end up being sent to Google or Amazon or Apple or whatever.
[00:25:34.000 --> 00:25:41.840] But it doesn't seem like an efficient way to run a business analyzing those snippets.
[00:25:41.840 --> 00:25:43.040] Absolutely not.
[00:25:43.040 --> 00:25:56.560] So in a study published in the journal Computer Speech and Language in 2022, researchers found that devices are accidentally triggered like this on average about once an hour by a TV show or a news broadcast or whatever.
[00:25:56.560 --> 00:26:04.960] And in around half of those cases, a significant amount of audio recording, which is over 10 seconds of audio, is uploaded for processing.
[00:26:04.960 --> 00:26:06.320] And that could contain anything.
[00:26:06.560 --> 00:26:06.880] Right.
[00:26:06.880 --> 00:26:11.920] And that's what about 240 minutes if it happens every single hour over the 24-hour period.
[00:26:11.920 --> 00:26:17.120] So it's entirely plausible that Amazon, Google, Apple, or whatever are taking these accidental quotes.
[00:26:17.200 --> 00:26:18.640] Sorry, 240 seconds, not minutes.
[00:26:18.640 --> 00:26:18.960] Sorry.
[00:26:18.960 --> 00:26:19.520] Before I open it.
[00:26:18.920 --> 00:26:20.640] Indeed, before anyone mentioned it.
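Spelling out the arithmetic behind that correction, using the figures quoted from the study above:

```python
# Roughly one accidental trigger per hour, each ">10 second" upload treated
# as ~10 seconds of audio.
triggers_per_day = 24
seconds_per_upload = 10

# The on-air figure, counting every trigger:
print(triggers_per_day * seconds_per_upload, "seconds per day")        # 240

# Applying the "about half result in a significant upload" finding instead:
print(triggers_per_day * 0.5 * seconds_per_upload, "seconds per day")  # 120.0
```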
[00:26:20.640 --> 00:26:26.720] And they could be taking that and saying, well, the user sent this by mistake, but let's feed it into the ad model anyway.
[00:26:27.040 --> 00:26:28.080] You know, why not?
[00:26:28.080 --> 00:26:30.040] Facebook could be doing the same thing.
[00:26:30.040 --> 00:26:35.480] They don't have their own voice assistant software, but they could take the audio from videos you upload to your Facebook.
[00:26:29.680 --> 00:26:36.200] Like I'm saying, yeah.
[00:26:36.520 --> 00:26:41.880] Or if you just open the camera app, or if you send someone a voice memo or whatever, they could listen to all of these things.
[00:26:41.880 --> 00:26:48.680] But also, like, when they can analyze the data from when you actually say to your phone, okay, Google, show me the nearest ice cream shop.
[00:26:48.920 --> 00:26:55.160] Indeed, like, yeah, there are times when you are giving them the stuff that they can use to market at you.
[00:26:55.160 --> 00:26:58.920] So, all that said, these are technically plausible mechanisms.
[00:26:58.920 --> 00:27:07.720] There is no evidence that this is what was happening, and no evidence that Cox Media Group was using this technique for their active listening product.
[00:27:07.720 --> 00:27:11.800] And in fact, there's good reasons to think that they are not using these techniques.
[00:27:11.800 --> 00:27:14.520] So, first was the price of their product.
[00:27:14.520 --> 00:27:24.280] They were advertising their active listening product as $100 per day for the 10-mile radius product and $200 per day for the 20-mile radius product.
[00:27:24.600 --> 00:27:29.480] That's absurdly cheap for this sort of surveillance advertising product.
[00:27:29.480 --> 00:27:42.840] If it really worked that way, if it really works the way that they were advertising it as working, they would need to spend significantly more money on the technical infrastructure to make it work than they are getting from customers by selling them this product.
[00:27:42.840 --> 00:27:48.200] Because you're thinking from this location in a 10-mile radius, how many phones are in that space?
[00:27:48.200 --> 00:27:56.840] I would need to be crunching data on essentially all of those constantly, or accessing something that's constantly doing that. Like, they could say that they're not the ones doing that.
[00:27:56.840 --> 00:28:00.600] That there's another service out there that's automatically doing that, and they have access to that database.
[00:28:00.600 --> 00:28:05.320] But yeah, this sort of bulk audio processing is not cheap.
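As a very rough illustration of why, here is a back-of-the-envelope cost sketch. Every figure in it (the device count, the hours of speech, the per-hour transcription rate) is an assumption made up for the sake of argument, not CMG's or any vendor's real numbers.

```python
# All assumptions, purely to show the order of magnitude involved.
phones_in_10_mile_radius = 50_000        # assumed device count near a city-centre business
speech_hours_per_phone_per_day = 2       # assumed hours of nearby conversation per device
transcription_cost_per_hour = 1.0        # assumed $/hour for bulk speech-to-text, ignoring
                                         # storage, bandwidth and analysis on top

daily_cost = (phones_in_10_mile_radius
              * speech_hours_per_phone_per_day
              * transcription_cost_per_hour)
print(f"Rough daily processing cost: ${daily_cost:,.0f}")    # $100,000
print("Versus the advertised price of the product: $100 per day")
```

Even with assumptions tilted to be generous, the processing bill lands orders of magnitude above the advertised price.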
[00:28:05.640 --> 00:28:10.120] Second reason why I don't think this is happening was the lack of response from regulators.
[00:28:10.120 --> 00:28:26.960] When 404 Media blew the whistle on this, on what Cox Media Group were doing, there were raised eyebrows in the technical press and the software development and cybersecurity communities, and no appreciable response at all from regulators in the UK, the US, Australia, the EU, nothing.
[00:28:26.960 --> 00:28:28.720] I think the EU is the most significant one.
[00:28:28.880 --> 00:28:35.120] Yes, because there's no response from US regulators.
[00:28:35.440 --> 00:28:37.440] There's no response from US regulators to anything.
[00:28:37.760 --> 00:28:50.480] Well, oddly enough, the closest we got to a response from any regulator was a MAGA Republican asking questions about it and saying, well, Facebook's listening to our stuff and it's a conspiracy, and Zuckerberg, et cetera, et cetera, et cetera.
[00:28:50.480 --> 00:28:54.560] And they wrote a message and said, I want an explanation for what's going on here.
[00:28:54.560 --> 00:28:55.920] And then it never went anywhere else.
[00:28:55.920 --> 00:29:00.080] Yeah, because they wrote a message and Mark Zuckerberg wrote a check and then it was all fine.
[00:29:01.360 --> 00:29:12.160] And if Cox Media Group were actually doing what they claimed to be doing, I would have expected a much stronger response, or indeed any kind of response, from regulators, and there was none.
[00:29:12.160 --> 00:29:13.680] There was nothing.
[00:29:14.000 --> 00:29:20.160] But finally, and this is the biggest one, this is just not a practical way to do surveillance advertising.
[00:29:20.160 --> 00:29:26.480] Fetching, uploading, storing, processing huge amounts of audio data like this cost a fucking bomb.
[00:29:26.480 --> 00:29:27.120] Yeah.
[00:29:27.120 --> 00:29:31.840] It would cause a noticeable drain on people's battery on their devices.
[00:29:31.840 --> 00:29:37.120] It would burn through metered data plans on people's phones, which are still very common on mobile phone contracts.
[00:29:37.120 --> 00:29:40.000] I've got 60 gig of data, 30 gig of data, whatever.
[00:29:40.000 --> 00:29:49.480] It would burn through that, all while offering little to no advantage over all of the existing ways that they've got of doing surveillance advertising cheaper.
[00:29:49.680 --> 00:29:51.520] Yeah, if anything, it would muddy the waters.
[00:29:51.520 --> 00:29:56.240] It would muddy your data with so much noise that it could actually detract from your other methods that were better.
[00:29:56.240 --> 00:30:08.200] Yeah, so the methods they already have, tracking which websites you visit, which searches you do, your IP address, your location, what do you post on Facebook, what do you buy from Amazon, what do you post on Twitter?
[00:30:08.200 --> 00:30:22.280] The combination of the technical impracticality of this, the legal risk of this, and questionable economic benefit makes the systematic audio surveillance practically impossible to implement at any kind of scale.
[00:30:22.280 --> 00:30:27.400] There is no reason why you would do this because all of the other ways of doing it are massively cheaper.
[00:30:27.400 --> 00:30:28.280] Yeah.
[00:30:28.920 --> 00:30:41.400] The reason you might see an ad for a mattress right after talking to your partner about needing a new mattress is a combination of confirmation bias and the existing surveillance advertising mechanisms that are already in place.
[00:30:41.400 --> 00:30:41.640] Yeah.
[00:30:41.640 --> 00:30:44.840] Like maybe you've Googled for a mattress at some point.
[00:30:44.840 --> 00:30:46.360] You've searched Amazon for a mattress.
[00:30:46.520 --> 00:30:49.800] Or your friends sent you a Facebook message about how they just got a new mattress and it was great.
[00:30:49.800 --> 00:30:51.240] And you thought, oh, maybe it's time that I do.
[00:30:51.240 --> 00:30:54.120] But Facebook knows that you received that message.
[00:30:54.120 --> 00:30:54.440] Yeah.
[00:30:54.440 --> 00:31:04.920] It's going to be that or the fact that half the fucking internet is ads for mattresses these days, or meal kits, or VPNs, or a really exciting new hose pipe, which is a very exciting new hose pipe.
[00:31:05.080 --> 00:31:07.400] Massively exciting new hose pipes.
[00:31:07.400 --> 00:31:13.000] Or it's Thomas Smith advertising insurance, I think, against his will because he did not do that.
[00:31:13.000 --> 00:31:16.440] So people may have heard an advert with Thomas Smith doing that on our podcast.
[00:31:16.520 --> 00:31:17.240] On the show, yeah.
[00:31:17.240 --> 00:31:20.120] And he is livid because he did not give them permission to use that audio.
[00:31:20.280 --> 00:31:22.600] Yeah, they just clipped it out of his show.
[00:31:23.560 --> 00:31:24.680] By mistake, apparently.
[00:31:24.680 --> 00:31:26.040] By mistake, they did that.
[00:31:26.040 --> 00:31:26.840] It happens by accident.
[00:31:27.000 --> 00:31:30.040] You accidentally download Thomas's show and edit his ad out and then sell it.
[00:31:30.200 --> 00:31:31.080] And then put it in other shows.
[00:31:31.400 --> 00:31:31.880] Yeah.
[00:31:32.200 --> 00:31:41.160] So, if the data suggests this isn't happening, or at least that it's not likely to be happening at any sort of scale...
[00:31:41.160 --> 00:31:43.720] What the fuck was CMG selling to their customers?
[00:31:43.720 --> 00:31:46.560] Yeah, this is the question that was occurring to me as we were going through this.
[00:31:47.200 --> 00:31:49.680] They were just like, Why are they lying about this?
[00:31:49.920 --> 00:31:51.600] What's to gain from that?
[00:31:51.600 --> 00:31:57.040] So, CMG have now removed all references to the active listening products from their website.
[00:31:57.040 --> 00:32:00.240] They denied that they were ever using any audio recordings.
[00:32:00.240 --> 00:32:08.160] Their position is now that they just aggregate data from third parties and they do not use conversations recorded on people's devices.
[00:32:08.160 --> 00:32:14.720] No whistleblowers or former employees have come forward with technical details about what CMG were doing.
[00:32:15.040 --> 00:32:21.920] No internal documents, technical specifications, or operational procedures have leaked beyond the pitch deck.
[00:32:21.920 --> 00:32:27.680] Google, Apple, and Amazon have all denied selling voice data to anybody, much less to CMG.
[00:32:27.680 --> 00:32:36.800] I suppose Google and Amazon could be feeding their own ad targeting platforms with this accidentally captured data around the wake word, etc.
[00:32:37.120 --> 00:32:43.920] So, they aren't selling the audio recordings, but maybe they're selling the insights gleaned from those audio recordings.
[00:32:43.920 --> 00:32:52.640] But there's no real evidence that that interpretation is true either, other than CMG's initial claims that those were insights that they had.
[00:32:52.960 --> 00:32:56.880] So, for my money, frankly, I think this was just marketing hyperbole.
[00:32:56.880 --> 00:33:02.480] I don't think there was any ever real legitimate technical capability to do this from CMG.
[00:33:02.480 --> 00:33:07.040] I might be proven wrong in time, but right now, I think, frankly, this was just CMG
[00:33:07.040 --> 00:33:08.640] being full of shit.
[00:33:12.480 --> 00:33:15.360] So, we had the Merseyside Skeptics summer picnic.
[00:33:15.360 --> 00:33:15.920] We did.
[00:33:15.920 --> 00:33:16.880] It was lovely.
[00:33:17.120 --> 00:33:24.480] So, as Marsh mentioned on the last show, we've been in a heat wave, and the Saturday before the picnic was horrifically hot.
[00:33:24.640 --> 00:33:25.120] It was bad.
[00:33:25.600 --> 00:33:29.520] And luckily, the Sunday that we actually had the picnic on was really temperate.
[00:33:29.520 --> 00:33:30.000] It was lovely.
[00:33:30.200 --> 00:33:30.760] Yeah, it was lovely.
[00:33:30.760 --> 00:33:33.080] We had a lot of people turn up who we hadn't seen before.
[00:33:33.800 --> 00:33:35.640] People who might even come back, people listeners.
[00:33:35.640 --> 00:33:37.400] In fact, people who listen to the show who will be able to do it.
[00:33:37.480 --> 00:33:38.920] We've seen some listeners there, which was nice.
[00:33:39.240 --> 00:33:47.720] The board meeting we had before the picnic, we put contingency plans in place saying if the weather is really bad, then we might have to cancel a picnic or pull the picnic or whatever.
[00:33:47.720 --> 00:33:50.920] I didn't anticipate we might have to consider pulling it because it's too hot.
[00:33:51.640 --> 00:33:59.560] We did have to have some, we had a board member and an attendee bring emergency coverings like a gazebo and a thing.
[00:33:59.720 --> 00:34:01.080] We had a lot of ice.
[00:34:01.080 --> 00:34:08.920] But yeah, Phil went out to Costco and bought us a load of stuff and he bought, much to Alice's insistence more than anything, a nice big Costco cake with loads of beef.
[00:34:09.800 --> 00:34:10.520] Beautiful Costco cake.
[00:34:10.680 --> 00:34:11.720] I love a Costco cake.
[00:34:11.720 --> 00:34:24.760] It's a really nice cake with a really nice buttercream icing and not the fucking cream cheese icing that's everywhere these days, which I don't like, but like a proper frothy, fluffy buttercream icing.
[00:34:24.760 --> 00:34:25.720] I didn't like the icing.
[00:34:26.200 --> 00:34:27.000] I did not like the icing.
[00:34:27.240 --> 00:34:29.000] You had a piece that had far too much icing on it.
[00:34:29.320 --> 00:34:31.240] I want an icing to be slightly firmer than that.
[00:34:31.720 --> 00:34:33.560] It was more like a whipped cream than an icing for me.
[00:34:33.800 --> 00:34:48.440] It's a delicious cake, but because they're fucking massive, and I'm basically the only person I know who loves Costco cakes, I never get them because I can't eat an entire cake to myself more quickly than it will go off and stale and horrible.
[00:34:48.760 --> 00:34:51.240] And Costco will put a pattern on your cake as well.
[00:34:51.240 --> 00:34:52.440] They'll print a pattern on the top.
[00:34:52.840 --> 00:34:56.840] He stuck it on when we arrived, and it's why it was slightly off-centre.
[00:34:56.840 --> 00:35:01.480] Yeah, so he tried to put it on, and I refused to look at it because I said, if I look at that, it's going to piss me off.
[00:35:01.480 --> 00:35:02.120] Oh, it was off-centre, though.
[00:35:02.200 --> 00:35:03.240] It wasn't on the wonk at all.
[00:35:03.800 --> 00:35:05.080] It was beautifully straight.
[00:35:05.560 --> 00:35:06.360] It just wasn't centered.
[00:35:06.520 --> 00:35:08.120] It wasn't perfectly centered.
[00:35:08.120 --> 00:35:09.880] It was a little bit off to one side.
[00:35:09.880 --> 00:35:11.320] So Phil came up with this plan.
[00:35:11.320 --> 00:35:15.840] He said, what we'll do is we'll start cutting the cake from the right-hand side.
[00:35:14.920 --> 00:35:17.840] And then at some point, it looks even.
[00:35:18.160 --> 00:35:23.520] And then eventually we cut enough cake and no one knows that it wasn't centered the whole time.
[00:35:23.520 --> 00:35:24.560] That's brilliant.
[00:35:24.560 --> 00:35:29.200] So when we came to cut the cake, what happened was we ignored that and started cutting it from the left-hand side.
[00:35:29.200 --> 00:35:30.400] Alice started cutting it from the corner.
[00:35:30.640 --> 00:35:40.720] What's especially impressive about this epic failure is that not only did I start cutting because I do automatically do things left to right.
[00:35:40.720 --> 00:35:41.840] I read left to right.
[00:35:41.840 --> 00:35:42.480] I do everything.
[00:35:42.480 --> 00:35:44.320] When I'm organizing things, I do it left to right.
[00:35:44.320 --> 00:35:46.320] When I put my contact lenses in, I go left to right.
[00:35:46.320 --> 00:35:47.680] Like, I do everything left to right.
[00:35:47.680 --> 00:35:56.240] So I just habitually went left to right, despite the fact that that required me to use my left hand when I'm right-handed to cut the cake.
[00:35:56.240 --> 00:36:00.960] And you have to move someone out of the way who was stood there, because obviously that's where you wouldn't be cutting.
[00:36:00.960 --> 00:36:03.760] If you were to cut an aubergine, would you cut it left to right?
[00:36:04.560 --> 00:36:04.960] What?
[00:36:04.960 --> 00:36:05.520] No?
[00:36:06.000 --> 00:36:06.960] You said you did everything left.
[00:36:07.200 --> 00:36:07.520] You would do that.
[00:36:07.680 --> 00:36:09.040] You said you did everything left to right.
[00:36:09.040 --> 00:36:09.600] No, that's true.
[00:36:09.600 --> 00:36:10.000] I would do it.
[00:36:10.320 --> 00:36:12.720] You cut an aubergine, you'd hold it with your left hand and you'd cut from the right.
[00:36:12.800 --> 00:36:13.760] You could cut from the right, yeah.
[00:36:14.160 --> 00:36:15.120] And you were cutting.
[00:36:15.120 --> 00:36:16.240] This was using a knife.
[00:36:16.480 --> 00:36:18.880] This was much more like cutting an aubergine than writing something.
[00:36:20.160 --> 00:36:21.200] I don't know.
[00:36:21.200 --> 00:36:22.720] It just went wrong.
[00:36:22.720 --> 00:36:25.520] It ended up crooked because I was using my left hand.
[00:36:25.920 --> 00:36:28.080] I didn't want to be cutting the cake.
[00:36:28.080 --> 00:36:30.160] I said I waited.
[00:36:30.160 --> 00:36:36.160] I wasn't getting the cake started because I didn't want to be left with the responsibility of once you cut one piece of cake.
[00:36:36.160 --> 00:36:36.320] You've got to get a cut.
[00:36:36.480 --> 00:36:38.160] You've got to give everybody a piece of cake.
[00:36:38.160 --> 00:36:39.760] And I didn't want to do that.
[00:36:39.760 --> 00:36:43.920] So I ended up doing something I didn't want to do and doing it badly.
[00:36:44.160 --> 00:36:45.600] Very, very badly.
[00:36:45.600 --> 00:36:48.880] So we had this cake with a beautiful MSS logo on the top.
[00:36:48.880 --> 00:36:51.200] And Alice, you cut yourself off a slice of cake.
[00:36:51.200 --> 00:36:54.480] You didn't want to start being the cake dispenser for everyone.
[00:36:54.640 --> 00:36:55.440] So you had to slice it.
[00:36:55.680 --> 00:36:56.160] But I did.
[00:36:56.160 --> 00:36:56.720] But I did.
[00:36:56.720 --> 00:36:57.240] I did.
[00:36:56.800 --> 00:36:58.160] You did a couple of slices.
[00:36:58.400 --> 00:36:59.480] Eight or nine slices of cake.
[00:36:59.640 --> 00:37:00.440] A big cake like that.
[00:36:59.120 --> 00:37:02.360] You do one stripe down the full length of it.
[00:37:02.360 --> 00:37:03.320] You cut that into pieces.
[00:36:59.280 --> 00:37:04.840] There's the cake that gets wiped.
[00:37:05.000 --> 00:37:07.640] And then someone else comes along when there's none left and does the next stripe.
[00:37:07.640 --> 00:37:08.040] Yes.
[00:37:08.200 --> 00:37:14.760] And then Warren, you know, very helpful went in and started cutting out slices of cake for people and making little piles of cake so people could come up.
[00:37:14.760 --> 00:37:15.960] And then he stopped.
[00:37:15.960 --> 00:37:21.960] And he stopped about a third of the way through the cake, having cut the M off the MSS logo.
[00:37:22.280 --> 00:37:24.600] So now we had a cake that just said SS.
[00:37:24.600 --> 00:37:26.120] And it was SS on a wonk as well.
[00:37:26.120 --> 00:37:30.280] It was a lot of five SS because there was loads of space either side with no rice paper.
[00:37:30.600 --> 00:37:33.560] And I'm sitting there bald with a skinhead.
[00:37:33.800 --> 00:37:36.200] I'm sitting there bald with a skinhead.
[00:37:36.440 --> 00:37:38.680] We've got a cake that says SS on it.
[00:37:38.680 --> 00:37:40.280] It was a really bad look.
[00:37:40.280 --> 00:37:40.680] It was.
[00:37:41.480 --> 00:37:42.840] We need to eat more cake.
[00:37:42.840 --> 00:37:43.080] Yep.
[00:37:43.080 --> 00:37:45.000] And we had a flag that looked like a rifle as well.
[00:37:45.000 --> 00:37:47.640] We did not have hands at all.
[00:37:52.440 --> 00:37:55.000] So for QED, tickets are still sold out.
[00:37:55.000 --> 00:38:00.840] More tickets have not magically appeared, but you can get an online ticket to QED for £49.
[00:38:00.840 --> 00:38:03.080] You can do that by going to QEDcon.org.
[00:38:03.080 --> 00:38:11.160] And that gets you access to the live stream of the main stage, of the panel room, and of the live podcast room for the whole weekend.
[00:38:11.160 --> 00:38:14.280] And yes, you can go and buy that now at QEDcon.org.
[00:38:14.280 --> 00:38:14.680] You can.
[00:38:14.680 --> 00:38:16.040] It's a very straightforward process.
[00:38:16.360 --> 00:38:17.000] It'll be lots of fun.
[00:38:17.000 --> 00:38:21.720] You'll also find all the latest updates of what we're doing at QED for the final ever QED.
[00:38:21.960 --> 00:38:22.680] Final ever QED.
[00:38:22.840 --> 00:38:24.920] Be delighted if you came and took part.
[00:38:24.920 --> 00:38:28.200] Also, we should talk about our Patreon.
[00:38:28.200 --> 00:38:37.400] So if you like what we do, if you enjoy the show and you would like to support us, you can do that by visiting patreon.com forward slash skeptics with a K, where you can donate from as little as a pound a month.
[00:38:37.400 --> 00:38:40.600] And that gets you access to an ad-free version of this show.
[00:38:40.600 --> 00:38:43.400] We never listen to what you say when you're listening to the show.
[00:38:43.400 --> 00:38:45.680] Otherwise, you just get a lot of ads with swearing in them.
[00:38:46.880 --> 00:38:50.960] I imagine like we swear a lot, and you're probably going, Listen, these fucking wankers.
[00:38:44.920 --> 00:38:51.840] So don't listen.
[00:38:52.080 --> 00:38:54.480] We don't listen to your microphone when you listen to the show.
[00:38:54.480 --> 00:38:58.640] But yeah, you get an ad-free version of this show from as little as a pound a month.
[00:38:58.640 --> 00:39:05.200] And that's where you can support all the long hours that we put into actually writing and recording the show.
[00:39:05.200 --> 00:39:07.920] Especially in the fucking hot weather, Christ.
[00:39:07.920 --> 00:39:09.600] So, yes, you can do that.
[00:39:09.600 --> 00:39:12.640] And if you're not able to do that, of course, you can listen to the show with ads.
[00:39:12.640 --> 00:39:14.240] That also helps us out as well.
[00:39:14.240 --> 00:39:20.560] And the other thing that can help us out is if you leave a review for the show, that is a very helpful thing for you to do as well.
[00:39:20.560 --> 00:39:25.040] I found out what Apple's algorithm is for where you go in the iTunes charts.
[00:39:25.040 --> 00:39:25.680] Oh, really?
[00:39:25.680 --> 00:39:30.960] And it's the number of new subscribers in the last seven days.
[00:39:31.280 --> 00:39:40.640] So if you get a lot of new subscribers all at once, that's when you go up the charts, and you garner more subscribers, and then go further up the charts and garner more subscribers.
[00:39:40.640 --> 00:39:45.440] So listeners, what you need to do is go around, steal the phone of your friends and subscribe to the show.
[00:39:45.920 --> 00:39:46.480] 100%.
[00:39:46.720 --> 00:39:50.720] And that's the way to just go around that and increase our number of subscribers.
[00:39:51.040 --> 00:39:55.840] Buy four phones, put two in one room, two in a quiet room, subscribe them all to Skeptics with a K.
[00:39:55.840 --> 00:39:57.200] That's the way to do that.
[00:39:57.200 --> 00:40:04.000] You can also, if you want to support the work of the Merseyside Skeptics Society, do that at patreon.com forward slash merseyskeptics.
[00:40:04.000 --> 00:40:08.560] That also is nice and cheap and gets you an ad-free version of this show.
[00:40:08.560 --> 00:40:09.600] I think that's it then.
[00:40:09.600 --> 00:40:10.560] Is that all we've got time for?
[00:40:10.560 --> 00:40:12.000] I think that's all we've got time for.
[00:40:12.000 --> 00:40:13.360] That's all I've got time for.
[00:40:13.680 --> 00:40:17.360] All that remains is for me to thank Alice for coming on today.
[00:40:17.360 --> 00:40:17.760] Thank you.
[00:40:17.760 --> 00:40:18.560] Thank you to Marsh.
[00:40:18.640 --> 00:40:19.120] Thank you.
[00:40:19.280 --> 00:40:21.680] We've been Skeptics with a K, and we will see you next time.
[00:40:21.680 --> 00:40:22.320] Bye now.
[00:40:22.320 --> 00:40:23.120] Bye.
[00:40:27.920 --> 00:40:33.000] Skeptics with a K is produced by Skeptic Media in association with the Merseyside Skeptic Society.
[00:40:29.920 --> 00:40:41.960] For questions or comments, email podcast at skepticswithak.org and you can find out more about the Merseyside Skeptics at merseysideskeptics.org.uk.
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
  "segments": [
    {
      "segment_title": "Topic Discussion",
      "timestamp": "01:15:30",
      "key_takeaway": "main point from this segment",
      "segment_summary": "brief description of what was discussed"
    }
  ]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
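As an illustration of that timestamp convention, a rough sketch that turns a VTT cue start such as 00:34:56.840 into the HH:MM:SS form asked for above and rejects anything past the end of the audio; the function name and the example duration are assumptions, not values taken from this processing run:
def cue_start_to_hhmmss(cue_start: str, audio_length_seconds: float) -> str:
    # "00:34:56.840" -> "00:34:56", refusing timestamps beyond the audio length.
    hours, minutes, seconds = cue_start.split(":")
    total = int(hours) * 3600 + int(minutes) * 60 + float(seconds)
    if total > audio_length_seconds:
        raise ValueError("segment timestamp exceeds the audio length")
    return f"{int(hours):02d}:{int(minutes):02d}:{int(float(seconds)):02d}"

# e.g. cue_start_to_hhmmss("00:34:56.840", 2442.0) -> "00:34:56"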
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
  "media_mentions": [
    {
      "title": "Exact Title as Mentioned",
      "category": "Book",
      "author_artist": "N/A",
      "context": "Brief context of why it was mentioned",
      "context_phrase": "The exact sentence or phrase where it was mentioned",
      "timestamp": "estimated time like 01:15:30"
    }
  ]
}
If no media is mentioned, return: {"media_mentions": []}
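The "use the earlier of the two timestamps" rule above amounts to taking the left-hand side of the cue range; a minimal sketch, with the function name an assumption made for illustration:
def mention_timestamp(cue_range: str) -> str:
    # "01:13:42.520 --> 01:13:46.720" -> "01:13:42" (earlier end, milliseconds dropped)
    earlier = cue_range.split("-->")[0].strip()
    return earlier.split(".")[0]

# e.g. mention_timestamp("01:13:42.520 --> 01:13:46.720") -> "01:13:42"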
Full Transcript
[00:00:00.640 --> 00:00:06.800] This Labor Day at Value City Furniture, get up to 20% off your new living room and more throughout the store.
[00:00:06.800 --> 00:00:08.640] Think you deserve even more?
[00:00:08.640 --> 00:00:09.280] Same.
[00:00:09.280 --> 00:00:13.200] Like combining those savings with an extra 10% off door busters.
[00:00:13.200 --> 00:00:16.320] Plus, no interest financing for up to 40 months.
[00:00:16.320 --> 00:00:19.520] See if you pre-qualify without touching your credit score.
[00:00:19.520 --> 00:00:23.840] Yep, get more at Value City Furniture while paying less.
[00:00:23.840 --> 00:00:26.800] More style, more quality, more value.
[00:00:26.800 --> 00:00:29.280] We are Value City Furniture.
[00:00:36.960 --> 00:00:45.840] It is Thursday, the 24th of July, 2025, and you're listening to Skeptics with a K, the podcast for science, reason, and critical thinking.
[00:00:45.840 --> 00:00:56.720] Skeptics with a K is produced by Skeptic Media in association with the Merseyside Skeptic Society, a non-profit organization for the promotion of scientific skepticism on Merseyside around the UK and internationally.
[00:00:56.720 --> 00:00:58.080] I'm your host, Mike Hall.
[00:00:58.080 --> 00:00:59.120] With me today is Marsh.
[00:00:59.120 --> 00:00:59.600] Hello.
[00:00:59.600 --> 00:01:00.320] And Alice.
[00:01:00.320 --> 00:01:01.280] Hello.
[00:01:01.600 --> 00:01:09.520] So we've spoken on the show several times about the way large internet companies use the web to watch what you're doing.
[00:01:09.520 --> 00:01:21.920] Companies like Google and Facebook aggressively will take steps to monitor your online activities, see which websites you're looking at, how long you're looking at them for, how often you're looking at them.
[00:01:21.920 --> 00:01:31.520] And then they match these findings with personal information about you, your name, your address, your date of birth, et cetera, and aggregate all of this together and then they sell that to advertisers.
[00:01:31.520 --> 00:01:33.200] Is that something Amazon's doing as well?
[00:01:33.520 --> 00:01:34.400] Amazon does this as well.
[00:01:36.080 --> 00:01:39.920] And that advertising is not necessarily just for products and services.
[00:01:39.920 --> 00:02:01.640] I think it's bad enough if it was just for products and services, but it's also used for political advertising, which knowingly directs disinformation and misinformation at vulnerable people in order to sway their political opinions and targeting people who are likely to receive that message well and be influenced by it by using that information, those profiles that they've given up.
[00:01:59.680 --> 00:02:05.560] And that sways their political positions and therefore their vote.
[00:01:59.840 --> 00:02:06.040] Yep.
[00:02:07.240 --> 00:02:18.120] And much of this criticism has been leveled at Facebook, who have repeatedly and systematically worked to sidestep the controls that were put in place to protect individual privacy.
[00:02:18.120 --> 00:02:23.480] So they wiretapped their customers using a VPN to spy on their Snapchat and YouTube data.
[00:02:23.480 --> 00:02:32.120] They lied to users who gave them their phone number by saying this is just for you to log into your account, but then they used it for their advertising data anyway.
[00:02:32.440 --> 00:02:35.400] They installed trackers on the websites of several U.S.
[00:02:35.400 --> 00:02:41.400] hospitals, including prominent hospitals like Johns Hopkins, which they use to gather people's medical data.
[00:02:41.400 --> 00:02:45.080] And you can see those trackers sending back this is the ailment that they looked for.
[00:02:45.080 --> 00:02:46.360] They were looking for an appointment.
[00:02:46.360 --> 00:02:48.840] We can see that sort of information going back.
[00:02:48.840 --> 00:03:05.800] They also build so-called shadow profiles of non-Facebook users, where they gather information from people who use Facebook and from around the web in order to build a profile of people who aren't even on their platform in order to still be able to sell ads to those people.
[00:03:06.440 --> 00:03:13.000] And many of these things that they do are illegal, and Facebook is hit with scandal after scandal for these kind of privacy violations.
[00:03:13.000 --> 00:03:20.200] And each time they settle out of court and pay a fine, which is far smaller than the amount of money they made by committing the act in the first place.
[00:03:20.200 --> 00:03:32.760] Yeah, and then they go on Rogan and complain that they're being unfairly targeted by the EU who are just trying to crush them and how, yeah, specifically the American government should be easy on them in terms of regulation because it's patriotic to support an American company.
[00:03:32.760 --> 00:03:38.840] It's much like letting a bank robber settle with a thousand pound fine so they don't have to go to trial for their million pound heist.
[00:03:38.840 --> 00:03:41.160] You know, that's very much what Facebook do.
[00:03:41.160 --> 00:03:43.880] But that's not grounds that I'm looking to retread today.
[00:03:43.880 --> 00:03:51.200] In fact, I'm not even going to talk about Facebook's latest thing, which is using Facebook and Instagram to steal your private photos.
[00:03:51.200 --> 00:03:58.320] Which is to say, photos you didn't upload to Facebook or Instagram, they just take them off your phone and use them to train their AI.
[00:03:58.320 --> 00:04:01.520] So you have the app installed, it has permission to see your camera roll.
[00:04:01.520 --> 00:04:05.360] You didn't upload those pictures, but it'll take them and use them for their AI anyway.
[00:04:05.360 --> 00:04:10.320] I don't have the Facebook app installed, even if I rarely use Facebook, it's not installed anymore.
[00:04:10.320 --> 00:04:20.880] I'm not even going to talk about the news reports where Facebook's AI on WhatsApp gave the personal phone number of another WhatsApp user out in response to a query about railway timetables.
[00:04:20.880 --> 00:04:23.680] Somebody said, Where can I speak to this railway company?
[00:04:23.680 --> 00:04:29.280] And they sent him the number of a completely unrelated guy who was also a WhatsApp user.
[00:04:29.280 --> 00:04:29.760] Right.
[00:04:29.760 --> 00:04:30.800] I'm not going to talk about that either.
[00:04:31.040 --> 00:04:32.320] We can do a big deep dive on that.
[00:04:32.320 --> 00:04:33.680] That's not where we're going to go.
[00:04:33.680 --> 00:04:40.560] We have talked in the past about the persistent conspiracy theory that your mobile phone is listening to everything you say.
[00:04:40.880 --> 00:04:49.360] And companies like Facebook or Google or Apple or Amazon or others use these recordings to add into their advertising data.
[00:04:49.360 --> 00:04:55.840] And this is an incredibly pervasive idea to the point where people aren't even shocked at the suggestion.
[00:04:55.840 --> 00:04:57.120] It's kind of taken for granted.
[00:04:57.120 --> 00:04:58.560] Yeah, yeah, just assumes it's true.
[00:04:58.560 --> 00:04:59.680] This is true.
[00:04:59.680 --> 00:05:12.320] It's not uncommon at the moment to see videos on TikTok or YouTube or Instagram of people, it's frequently young women, deliberately whispering things to their boyfriend's phone to try and make it show him ads for that thing.
[00:05:12.320 --> 00:05:22.560] So you'll see women doing little skits or sketches for their TikTok where they're whispering, engagement ring, holiday in Marbella, to their boyfriend's phone.
[00:05:22.560 --> 00:05:26.960] Which is a gag, but a gag based on the shared apparent understanding that this is a real thing that happens.
[00:05:26.960 --> 00:05:32.600] It is just a little comedy bit, but for the bit to work, you have to accept that it's true, as you say.
[00:05:33.560 --> 00:05:38.920] And historically on this show, we have said that this sort of surveillance doesn't exist.
[00:05:38.920 --> 00:05:45.000] There is no widespread effort to covertly record everything you say and upload it to Facebook or Google or whatever.
[00:05:45.320 --> 00:05:48.280] But maybe I'm going to have to eat my words on that.
[00:05:48.280 --> 00:05:54.360] Maybe I'm going to have to chow down on some humble pie and then record myself doing it and upload it to Instagram.
[00:05:54.360 --> 00:06:00.600] Because a company called the Cox Media Group has claimed that they are doing exactly that.
[00:06:00.920 --> 00:06:14.600] So from their website, they say a smart device is likely within earshot when we talk about our plans for the weekend, how badly we need our kitchen remodeled, or debate which SUV is best for the family with our spouse.
[00:06:14.600 --> 00:06:16.360] None of them! Don't get an SUV.
[00:06:16.360 --> 00:06:18.280] You are destroying the planet.
[00:06:18.280 --> 00:06:26.680] When small businesses know who needs them, they can target ads with enhanced accuracy, waste less money, and grow their audience.
[00:06:26.680 --> 00:06:28.280] Creepy question mark?
[00:06:28.280 --> 00:06:29.000] Sure.
[00:06:29.000 --> 00:06:30.520] Great for marketing?
[00:06:30.520 --> 00:06:31.960] Definitely.
[00:06:32.280 --> 00:06:33.080] That's the pitch.
[00:06:33.240 --> 00:06:35.000] Okay, so they're selling me as a service.
[00:06:35.160 --> 00:06:37.560] They say, we do this, we're unashamed about it.
[00:06:37.560 --> 00:06:38.760] We will do this for you.
[00:06:38.760 --> 00:06:43.800] I feel like we just accept advertising as an excuse for everything.
[00:06:43.800 --> 00:06:44.120] Yeah.
[00:06:44.120 --> 00:06:51.880] That, like, we've never really challenged the fact that, like, we challenge that there's too much advertising, that advertising is an issue, that capitalism is an issue, blah, blah.
[00:06:52.120 --> 00:06:56.360] But we haven't challenged the existence of advertising, full stop.
[00:06:56.360 --> 00:06:59.480] We've just accepted that as the start point.
[00:06:59.480 --> 00:07:02.520] And then everything else we complain about is the stuff beyond that.
[00:07:02.520 --> 00:07:05.320] And the intrusions of adverts, obviously, I talked about this on the last show.
[00:07:05.320 --> 00:07:07.480] The intrusion of adverts bother me so much.
[00:07:07.480 --> 00:07:17.520] And the thing that amazes me is how adverts of a different genre bypass any of your flags of, oh, this is a bad thing, or here are the rules that we have.
[00:07:17.520 --> 00:07:21.920] What struck me was Gary Lineker has a podcast, a very big podcast empire, all about football.
[00:07:14.760 --> 00:07:24.160] He, Alan Shearer, will be on there.
[00:07:24.160 --> 00:07:28.000] Because it's a podcast, they'll get podcast advertising where they'll do host-read ads.
[00:07:28.000 --> 00:07:34.160] And Gary Lineker and Alan Shearer will read out the ads for fucking whatever meal preparation program.
[00:07:34.160 --> 00:07:38.560] And it strikes me that Alan Shearer used to do ads for McDonald's and Gillette.
[00:07:38.560 --> 00:07:44.000] And if you wanted Alan Shearer in your ad, I imagine it would cost you hundreds of thousands of pounds.
[00:07:44.000 --> 00:07:45.760] You're one of the famous footballers in the world at the time.
[00:07:46.000 --> 00:07:50.240] It's six or seven figures to have Alan Shearer or indeed Gary Lineker do your ads.
[00:07:50.480 --> 00:07:50.800] Exactly.
[00:07:50.800 --> 00:07:53.840] I mean, Nish Kumar does ad reads on Pod Save the UK.
[00:07:53.840 --> 00:08:02.240] To get Nish Kumar in an ad is going to be thousands and thousands of pounds, but hand him copy, hand these guys copy because they happen to be sat with a microphone.
[00:08:02.240 --> 00:08:05.440] And the convention of podcast advertising is, well, the hosts read out ads.
[00:08:05.680 --> 00:08:06.640] And so we'll just do that.
[00:08:06.800 --> 00:08:11.360] We won't negotiate an extra price because of how big a celebrity the blue person is.
[00:08:11.360 --> 00:08:17.520] They just put their whole it's wild to me that people's that the advertising world has shifted in that kind of way and nobody's noticed it.
[00:08:17.520 --> 00:08:19.520] But we don't question anything when it comes to advertising.
[00:08:19.520 --> 00:08:21.600] Advertising is just an excuse for anything.
[00:08:21.600 --> 00:08:25.440] Like, well, like you say, oh, well, is it creepy?
[00:08:25.440 --> 00:08:27.280] Yes, but it's good for marketing.
[00:08:27.360 --> 00:08:29.440] It's great for marketing, so it's good for a time.
[00:08:29.520 --> 00:08:32.000] There's a trend at the moment that's been ongoing for a while.
[00:08:32.000 --> 00:08:34.720] I had a whine about this on Blue Sky recently.
[00:08:34.720 --> 00:08:37.280] It's a model that we refer to as consent or pay.
[00:08:37.280 --> 00:08:38.000] Oh, God, I hate it.
[00:08:38.800 --> 00:08:39.920] I hate it so much.
[00:08:39.920 --> 00:08:45.120] A lot of newspapers are doing this now where it's like you can pay for a subscription or you can have the ads.
[00:08:45.120 --> 00:08:46.240] And, you know, it's up to you.
[00:08:46.240 --> 00:08:47.120] It's up to you which one.
[00:08:47.120 --> 00:08:47.680] It's your choice.
[00:08:47.680 --> 00:08:50.000] The independent's got a thing saying it's your choice.
[00:08:50.000 --> 00:08:52.000] You can choose whichever one you like.
[00:08:52.000 --> 00:08:56.960] And fundamentally, what that comes down to is rich people get to preserve their privacy and poor people don't.
[00:08:57.520 --> 00:08:58.800] That's fundamentally what it comes down to.
[00:08:58.960 --> 00:09:00.680] I would happily accept the ads.
[00:09:00.680 --> 00:09:01.880] I don't want the tracking.
[00:08:59.680 --> 00:09:03.400] That's the problem is the issue.
[00:08:59.840 --> 00:09:04.280] Take those away.
[00:09:04.440 --> 00:09:08.200] Just serve me an ad and I will have that next to the article and I'll happily read that.
[00:09:08.200 --> 00:09:14.120] The way that ads has always worked in the advertising industry, in the newspaper industry, the entirety of the time.
[00:09:14.120 --> 00:09:20.520] But now it has to be, I need to know everything about you so I can send you a picture of things you're still not going to fucking buy, but I get to invade your privacy.
[00:09:20.520 --> 00:09:24.440] And in the last 20 years, ads and tracking have become synonymous.
[00:09:24.440 --> 00:09:28.680] The advertising industry can't seem to cope with the idea of an ad without tracking.
[00:09:29.000 --> 00:09:34.840] Whereas if you go to the 90s, 80s, 70s, no one was tracking whether you watch the ads on fucking Coronation Street or not.
[00:09:35.160 --> 00:09:38.040] No, you just put ads for Coronation Street watchers on Coronation Street.
[00:09:38.280 --> 00:09:38.920] Coronation Street.
[00:09:39.080 --> 00:09:39.800] Yeah, absolutely.
[00:09:39.800 --> 00:09:43.240] No one was tracking which billboard that you were looking at as you drove past it, that sort of thing.
[00:09:43.640 --> 00:09:49.000] And you do a focus group instead of like tracking a load of people and looking at those analytic data.
[00:09:49.000 --> 00:09:53.800] But now it's ads and tracking are synonymous and the industry can't imagine one without the other.
[00:09:53.800 --> 00:09:59.720] To the point where Cox Media Group, as you've said there, Alice, openly acknowledge that their product is creepy.
[00:09:59.720 --> 00:10:04.920] In fact, elsewhere on their website, they compare their product to an episode of Black Mirror.
[00:10:05.320 --> 00:10:08.360] They say, I know it sounds like something out of Black Mirror, but it's here today.
[00:10:08.360 --> 00:10:10.200] The thing is, that's also marketing.
[00:10:10.200 --> 00:10:11.080] That's their U.S.
[00:10:11.240 --> 00:10:12.200] Outrage marketing, yeah.
[00:10:12.200 --> 00:10:12.760] Yeah, exactly.
[00:10:12.760 --> 00:10:14.200] That's their USP.
[00:10:14.200 --> 00:10:20.200] It's that, oh, we will own the fact that this is so outrageous and try and get you to pass this around for that reason.
[00:10:20.200 --> 00:10:28.040] So CMG, Cox Media Group, their pitch listed Facebook, Amazon, and Google as technology partners for this product.
[00:10:28.040 --> 00:10:30.280] And they called this product active listening.
[00:10:31.240 --> 00:10:32.920] That was the product that they're pitching.
[00:10:32.920 --> 00:10:34.680] That's bad as well.
[00:10:34.840 --> 00:10:35.880] Well, yeah, because it's true.
[00:10:35.880 --> 00:10:38.680] Active listening is a therapy term.
[00:10:38.680 --> 00:10:39.480] Oh, I see.
[00:10:39.480 --> 00:10:41.560] Or a coaching term, whatever you want to call it.
[00:10:41.640 --> 00:10:45.840] Yeah, it's when you don't just wait for your turn to speak in a conversation.
[00:10:45.840 --> 00:10:49.520] You're sitting there and listening and engaging with and trying to take on board.
[00:10:44.920 --> 00:10:52.080] Yeah, but it's remarkable how many people don't.
[00:10:52.640 --> 00:10:55.360] You don't have to change the name of it, just get people to do it.
[00:10:55.680 --> 00:11:03.600] So the technology website 404 Media last year published a pitch deck which was used by CMG to advertise their active listening product.
[00:11:03.760 --> 00:11:07.760] Somebody leaked their sales pitch deck to 404.
[00:11:07.760 --> 00:11:16.400] And the pitch deck included claims that they use AI to gather data from hundreds of sources, including Facebook, Amazon, Google, and LinkedIn.
[00:11:16.400 --> 00:11:24.080] And they aggregate that together with behavioral data and voice data to target consumers who are in the market for your product.
[00:11:24.080 --> 00:11:27.120] Okay, that's a different claim to what they said.
[00:11:27.360 --> 00:11:30.720] That sounds like a different claim from what they were saying at the start.
[00:11:30.720 --> 00:11:33.280] Where they're saying that it's all about the voice data.
[00:11:33.280 --> 00:11:34.000] It's active listening.
[00:11:34.400 --> 00:11:38.800] Because even voice data, to say it's voice data, isn't quite the same as constant surveillance.
[00:11:38.800 --> 00:11:39.120] Yeah.
[00:11:39.760 --> 00:11:42.400] They also appear to be hyper-localized.
[00:11:42.560 --> 00:11:45.360] The pitch deck mentions two packages that they sell.
[00:11:45.360 --> 00:11:52.720] One targets consumers who are within 10 miles of your business, and the other targets consumers within 20 miles of your business.
[00:11:52.720 --> 00:11:54.960] Yeah, okay, that's understandable how they could do that.
[00:11:54.960 --> 00:12:02.560] It says, this is a quote from the pitch deck: smart devices capture real-time intent data by listening to our conversations.
[00:12:02.560 --> 00:12:08.320] Advertisers pair this voice data with behavioral data to target in-market consumers.
[00:12:08.320 --> 00:12:13.040] In-market consumers, presumably meaning consumers who are in the market for your product.
[00:12:13.360 --> 00:12:20.160] So somebody's walking nearby and says, oh, I could just have an ice cream, and you'd suddenly get a notification to the ice cream shop that's around the corner.
[00:12:20.160 --> 00:12:21.920] Yeah, that sort of thing.
[00:12:22.240 --> 00:12:25.200] So there's little ambiguity in this.
[00:12:25.200 --> 00:12:33.960] CMG appears to be confirming this long-standing conspiracy theory, and all those women on TikTok will be getting their engagement rings soon, presumably.
[00:12:34.520 --> 00:12:41.640] But as long as it's within like 10 blocks of a jeweler who has bought the active listening product.
[00:12:41.640 --> 00:12:43.480] Yeah, they're going to get a really shit engagement ring.
[00:12:43.800 --> 00:12:45.960] A really cheap engagement ring.
[00:12:45.960 --> 00:12:47.640] So I'm still skeptical on this.
[00:12:47.640 --> 00:12:48.200] Yeah.
[00:12:48.200 --> 00:12:56.760] When CMG's claims for their active listening products first came to light, technology firms went out of their way to distance themselves from CMG.
[00:12:56.760 --> 00:13:01.400] Google said, we used to work with CMG, but we've removed them from our partner program.
[00:13:01.400 --> 00:13:05.960] Amazon flatly denied that CMG ever had anything to do with them.
[00:13:05.960 --> 00:13:12.600] And Facebook, of course, did neither of these things and just said that their relationship with CMG is currently under review.
[00:13:12.600 --> 00:13:14.280] So fully non-committal.
[00:13:14.280 --> 00:13:16.680] Oh, they're just going to wait for this to blow over and see what happens.
[00:13:16.920 --> 00:13:18.440] I think Amazon have just lied, though.
[00:13:18.440 --> 00:13:22.120] So at least Facebook have said nothing, but Amazon have lied.
[00:13:22.120 --> 00:13:26.200] But also, like, this is how you get the creep into things like that.
[00:13:26.200 --> 00:13:31.240] I don't want to sound like a conspiracy theorist here, but you throw somebody under the bus and go, oh, people are going to be outraged.
[00:13:31.240 --> 00:13:36.120] And now, the next time we mention it, people stop noticing that it's outrageous because they've heard it before.
[00:13:36.440 --> 00:13:40.120] And now we can creep this stuff through because it actually would be useful.
[00:13:40.440 --> 00:13:48.520] So the incident attracted specific statements from Apple and Google denying that what CMG was claiming was even possible.
[00:13:48.520 --> 00:13:54.920] So Google told Variety, Android prevents apps from collecting audio data when they're not being used.
[00:13:55.480 --> 00:14:02.040] And Apple similarly said, no app can access the iPhone's microphone or camera without your permission.
[00:14:02.040 --> 00:14:02.600] Yes.
[00:14:02.600 --> 00:14:10.520] Apple then went further and said any data collected for Siri is not used to build a marketing profile and we never sell it to anybody.
[00:14:10.520 --> 00:14:22.800] That thing of no app can collect microphone data without your permission is a fucking cop-out because how many apps, A, how many apps that don't need your microphone ask for microphone permissions just because they ask for fucking everything?
[00:14:22.800 --> 00:14:26.960] And B, how many apps do you give permission to use the microphone?
[00:14:26.960 --> 00:14:40.160] Because you're assuming that they'll use it for when you're using the app and when you're specifically recording microphone or using a call or whatever you're using, it would reasonably use the microphone for in the app.
[00:14:40.160 --> 00:15:00.240] And that's my assumption or my guess as to where this is going is that the apps that you've given access to the microphone for, for example, you're putting videos up on Instagram, that gives you access to your voice data that could then be analyzed in any kind of way and then lined up with who you are from the other data that's available.
[00:15:00.240 --> 00:15:06.320] And so they can say, oh, we take all this data and voice data, but it's the voice data from the stuff you've willingly given them.
[00:15:06.320 --> 00:15:25.200] But, you know, it's not like you can just remove microphone privileges from apps because A, you cannot use a mobile phone without allowing some app to use your microphone because it's a fucking phone, but also there are a lot of apps that don't need your microphone that you still can only use if you give microphone privileges to, right?
[00:15:25.200 --> 00:15:26.880] Yeah, delete those ones.
[00:15:27.520 --> 00:15:28.320] So what's going on?
[00:15:28.320 --> 00:15:28.960] What's the story?
[00:15:29.280 --> 00:15:31.680] How are CMG saying we have this product?
[00:15:31.680 --> 00:15:39.840] But Apple and Google, whose software powers almost every phone on the planet, both say what CMG are claiming is not possible.
[00:15:40.480 --> 00:15:44.400] So most smartphones these days come with voice assistant software.
[00:15:44.400 --> 00:15:50.960] And that software is increasingly incorporating AI to assist in the comprehension of the user's request.
[00:15:50.960 --> 00:15:53.920] It's not just smartphones that do this either.
[00:15:53.920 --> 00:15:56.720] So, some televisions have got voice mode now.
[00:15:56.720 --> 00:16:00.600] There's obviously Amazon's line of Echo products, the Alexa products.
[00:15:59.600 --> 00:16:02.760] Never have it in my house.
[00:16:02.920 --> 00:16:08.040] Apple has got the HomePod, which is a similar kind of smart speaker product and so on.
[00:16:08.040 --> 00:16:10.200] But they all work in a very similar way.
[00:16:10.200 --> 00:16:13.800] This isn't related to audio, like to microphone stuff at all.
[00:16:13.800 --> 00:16:25.240] But I was writing an email on my laptop the other day, and I never use any of the AI services unless I accidentally click on it when I'm trying to click something else and then immediately close it.
[00:16:25.240 --> 00:16:26.520] Like, I never use any of that.
[00:16:26.520 --> 00:16:34.040] But I was working on this email, and I was on a call at the same time, so I kept pausing to listen to the call and then carrying on with my email.
[00:16:34.040 --> 00:16:37.000] So it was taking me a while to write this email because I was doing it in short bursts.
[00:16:37.000 --> 00:16:43.080] And after a couple of minutes of this, I got a pop-up on the side of my computer that said, Do you need help writing this email?
[00:16:43.080 --> 00:16:47.000] I was like, I do not have, I don't use AI on my laptop.
[00:16:47.000 --> 00:16:50.520] I'm using a web browser email service.
[00:16:50.760 --> 00:16:55.960] Like, you should not be looking at that and seeing that it's taking me a while to write an email and be offering me help.
[00:16:56.040 --> 00:16:57.560] And then trying to quote a group help.
[00:16:57.960 --> 00:16:59.320] This is ridiculous.
[00:16:59.480 --> 00:17:00.520] It's so infuriating.
[00:17:00.520 --> 00:17:10.360] I mean, so with the math projects that I work on through Good Thinking, the Parallel Academy stuff, we have quite a few Google subscriptions because we have to have a Gmail inbox and stuff.
[00:17:10.360 --> 00:17:19.000] And recently we got an email from Google saying, We're putting up the cost of your prescription, your subscription, because each month you're now getting access to powerful AI tools.
[00:17:19.320 --> 00:17:20.760] Well, I don't want those tools.
[00:17:20.760 --> 00:17:24.520] You're just charging me for now for stuff that I don't want that I don't think should exist.
[00:17:24.520 --> 00:17:25.160] Yeah.
[00:17:25.160 --> 00:17:39.880] And Google recently announced that I was going to integrate this with the story and didn't, but Google recently announced that their Gemini AI is going to be enabled by default on your Android phone and will have access to your WhatsApp messages when it does that.
[00:17:39.880 --> 00:17:41.400] Jesus Christ.
[00:17:41.400 --> 00:17:42.280] So, yeah.
[00:17:42.600 --> 00:17:46.880] Anyway, these voice assistant services all work in roughly the same way.
[00:17:47.040 --> 00:17:53.120] So these devices are always listening, they are always monitoring, and they are listening for a specific phrase.
[00:17:53.120 --> 00:17:53.440] Yes.
[00:17:53.600 --> 00:17:59.680] And this phrase you might refer to as the wake word, which tells the device you have a query or an instruction for it.
[00:17:59.680 --> 00:18:02.640] So on Apple devices, that's Siri or Hey Siri.
[00:18:02.640 --> 00:18:04.480] On Amazon Echo, it's Alexa.
[00:18:04.480 --> 00:18:06.240] For Google, it's OK Google.
[00:18:06.240 --> 00:18:08.720] Everyone's phone is just lit up listening to this now.
[00:18:09.360 --> 00:18:24.960] And when the device hears the wake word, it records a short section of audio and sends that over the internet to a machine which will interpret what you have said, returns that interpretation to your device, and your device can then act on the text of the request.
[00:18:24.960 --> 00:18:32.000] Saying, oh, you asked what time it is, or to set a reminder, or to read notifications, or whatever.
[00:18:33.280 --> 00:18:40.000] But it would be misleading to say that your device is always listening in any meaningful sense.
[00:18:40.000 --> 00:18:44.160] Because for one thing, the devices are only listening for the trigger word, the wake word.
[00:18:44.160 --> 00:18:46.240] They aren't recording everything you say.
[00:18:46.240 --> 00:18:49.760] They're looking for a specific acoustic pattern.
[00:18:49.760 --> 00:19:38.040] And only when they detect that pattern will they act on it. It's like characterizing your smoke detector as always sniffing, right? Yeah, it kind of isn't, actually; that's not a reasonable characterization of what it's doing. And all of that takes place on the device: listening for the wake word is something the device does on its own, and only when it hears it does it start recording and then send that recording off-device for processing. Yeah, and the thing is, like when we talked about this the last time we touched on this topic, we know the phone isn't constantly sending everything to the server because of the battery power that would take. Your battery would not last if it was not just recording everything but also transmitting it. And you'd notice the data use as you walk around; if you had any kind of limited data plan, you would notice that it's constantly doing that.
[00:19:38.040 --> 00:19:39.960] So it can't possibly be happening.
[00:19:39.960 --> 00:19:42.360] And we will come back to that point in a minute.
[00:19:42.360 --> 00:19:43.000] Okay.
[00:19:43.000 --> 00:19:46.760] So your phone isn't listening in the sense of having any comprehension of the words you're saying.
[00:19:46.760 --> 00:19:49.160] It's just pattern matching for the trigger phrase.
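As a rough illustration of the gating described above, here is a minimal sketch; the function names, the ten-second clip length, and the callback parameters are assumptions made for the example, not any vendor's actual implementation:
from typing import Callable, Iterable, Iterator

def handle_microphone(
    frames: Iterable[bytes],                      # a continuous stream of audio frames
    matches_wake_word: Callable[[bytes], bool],   # on-device acoustic pattern matcher
    record_clip: Callable[[int], bytes],          # grabs a short clip after a match
    send_to_cloud: Callable[[bytes], str],        # the only call that moves audio off the device
) -> Iterator[str]:
    # The device is always "listening", but only pattern matching locally.
    for frame in frames:
        if not matches_wake_word(frame):
            continue                              # no match: nothing recorded, nothing uploaded
        clip = record_clip(10)                    # short clip captured after the wake word
        yield send_to_cloud(clip)                 # only this clip is interpreted remotely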
[00:19:49.160 --> 00:19:59.000] Now, it's a fair criticism to say, yeah, Mike, that's fine, but you only say that's what's happening because that's what they've told you is what's happening and they could be lying about it.
[00:19:59.000 --> 00:19:59.800] And maybe that's true.
[00:19:59.800 --> 00:20:02.760] Maybe it doesn't work like this at all and they're covering it up.
[00:20:02.760 --> 00:20:04.520] So how would we know?
[00:20:04.840 --> 00:20:18.440] So back in 2019, a group of researchers working at a cybersecurity firm called Wondera conducted a controlled experiment to see if they could find evidence of apps on your phone covertly recording conversations.
[00:20:18.760 --> 00:20:22.840] Now, before I describe the experiment that they did, this was not peer-reviewed.
[00:20:22.840 --> 00:20:25.000] It was only published on their website.
[00:20:25.000 --> 00:20:29.000] So with that caveat, use as much salt as you need in order to do this.
[00:20:29.000 --> 00:20:29.560] Personally, I'm not sure.
[00:20:29.800 --> 00:20:37.080] This is industry stuff rather than cybersecurity stuff.
[00:20:38.360 --> 00:20:41.880] So they took two smartphones, an Apple device and a Samsung device.
[00:20:41.880 --> 00:20:44.360] And the Samsung device was running Android.
[00:20:44.360 --> 00:20:47.560] And they put those phones in a room for three days.
[00:20:47.560 --> 00:20:48.200] Okay.
[00:20:48.200 --> 00:20:56.120] Both those devices had Facebook installed, Instagram installed, Chrome installed, Snapchat, YouTube, and Amazon.
[00:20:56.120 --> 00:20:57.240] This is pre-TikTok.
[00:20:58.200 --> 00:21:00.120] So this was before TikTok was huge.
[00:21:00.120 --> 00:21:00.760] Yeah.
[00:21:00.760 --> 00:21:04.920] Those apps were given full permission to access the microphone on the device.
[00:21:04.920 --> 00:21:07.880] So no restriction, no, only when the app is running.
[00:21:08.120 --> 00:21:10.680] Full permission on the microphone.
[00:21:11.000 --> 00:21:17.600] Every day for three days, they played a pet food ad into the room for half an hour.
[00:21:14.760 --> 00:21:18.960] And then it would stop.
[00:21:19.280 --> 00:21:25.200] As a control, there was also a quiet room where the devices were left in silence for three days.
[00:21:26.720 --> 00:21:30.560] In both cases, the devices were left switched on and locked.
[00:21:30.560 --> 00:21:32.480] So on the lock screen, switched on.
[00:21:32.480 --> 00:21:33.680] As you carry them around, yeah.
[00:21:33.680 --> 00:21:35.840] And who had the keys to the room?
[00:21:35.840 --> 00:21:39.040] That wasn't in the description that they put out there.
[00:21:39.040 --> 00:21:48.160] I can imagine a world where someone goes, oh, but like you're having a workplace romance and that room's locked and just quiet for three days.
[00:21:48.560 --> 00:21:52.160] You know, we can go in there and now you're talking to the phones.
[00:21:52.480 --> 00:21:53.200] So after the experiment.
[00:21:54.000 --> 00:21:55.760] Talking is what you're doing in there.
[00:21:56.400 --> 00:21:57.120] After the experiment.
[00:21:57.200 --> 00:21:59.920] Why is it bringing up ads for Squelch?
[00:21:59.920 --> 00:22:02.400] How is Squelch a product?
[00:22:03.360 --> 00:22:05.440] After the experiment, they measured three things.
[00:22:05.440 --> 00:22:09.440] One, how much battery was used in the pet food room versus the quiet room.
[00:22:10.000 --> 00:22:13.920] Two, how much data was used in the pet food room versus the quiet room.
[00:22:13.920 --> 00:22:17.840] And three, did any ads for pet food show up on the devices?
[00:22:18.800 --> 00:22:26.320] And what they found was that for both Android and Apple devices, no fucking difference in data use between the two rooms at all.
[00:22:27.040 --> 00:22:34.480] The data that was used was minimal, less than a megabyte, and in most cases, significantly less than a megabyte.
[00:22:34.480 --> 00:22:42.080] They had a control which was actually saying, hey, Siri, or, okay, Google, to these things to see how much... that's my phone just lit up when I said that.
[00:22:42.480 --> 00:22:46.080] Or saying okay, Google to one of these things to see how much data that used.
[00:22:46.080 --> 00:22:46.800] And it was huge.
[00:22:46.800 --> 00:22:49.600] It was like 30 megabytes, 100 megabytes.
[00:22:49.600 --> 00:23:02.280] So, that's a slightly unfair characterization because what they did was assume that a recording of the full 30 minutes would use the same amount of data per second as a triggered wake word.
[00:22:59.840 --> 00:23:06.600] And so they extrapolated how much it would expect to be over that time.
[00:23:06.920 --> 00:23:12.040] But yeah, there was minimal data use and no difference between the quiet room and the pet food room.
[00:23:12.040 --> 00:23:13.320] So that's a pretty good test.
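To make that extrapolation concrete, a back-of-envelope sketch; the per-request figures below are placeholder assumptions, and only the contrast between tens of megabytes expected and under a megabyte observed comes from the discussion above:
# Hypothetical inputs for illustration only.
request_seconds = 10          # assumed length of one wake-word clip
request_megabytes = 0.5       # assumed data used uploading that one clip
ad_seconds = 30 * 60          # the half-hour pet food ad played each day

expected_mb = request_megabytes * (ad_seconds / request_seconds)
print(f"expected if the full ad were recorded and uploaded: ~{expected_mb:.0f} MB")
print("observed in the test: under 1 MB, the same as the quiet room")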
[00:23:13.320 --> 00:23:20.840] They measured the data use of the phones by connecting them to their Wi-Fi so they could see whether the phones were making requests out.
[00:23:20.840 --> 00:23:27.720] But of course, maybe the phones are smart about it and they're actually uploading the data using the 5G connection or something like that.
[00:23:27.880 --> 00:23:30.680] Say, oh, when we're spying on people, don't use their Wi-Fi.
[00:23:30.680 --> 00:23:31.560] They'll rumble.
[00:23:31.560 --> 00:23:33.160] Use the cellular modem instead.
[00:23:33.160 --> 00:23:36.520] I think you'd be more likely to notice that if you had any kind of limited data plan.
[00:23:36.520 --> 00:23:41.480] But also, there was no difference in battery use between the silent room and the pet food room.
[00:23:41.480 --> 00:23:46.200] If they were uploading masses of audio data using the cellular modem, that's going to drain the battery.
[00:23:46.200 --> 00:23:47.080] Were they brand new phones?
[00:23:47.080 --> 00:23:47.800] Sorry, did you say that?
[00:23:47.800 --> 00:23:48.040] Sorry?
[00:23:48.120 --> 00:23:48.840] They were brand new phones.
[00:23:48.840 --> 00:23:50.440] Yeah, they were new phones.
[00:23:50.440 --> 00:23:53.160] So actually, there was no difference between the rooms at all.
[00:23:53.160 --> 00:24:01.480] And neither of the devices started showing ads for pet food after the test, either in apps or on the web.
[00:24:01.800 --> 00:24:03.560] Now, as I say, that's not peer-reviewed.
[00:24:03.560 --> 00:24:05.480] It's very much a back-of-the-envelope experiment.
[00:24:05.480 --> 00:24:11.400] It's the kind of thing we could probably put together at MSS; it's the kind of test that we might do, like we did with the Shuzi bands.
[00:24:11.400 --> 00:24:12.840] Yeah, we just couldn't afford the phones.
[00:24:13.480 --> 00:24:16.920] But nevertheless, I found the findings compelling on this.
[00:24:16.920 --> 00:24:26.840] They found no evidence of any of those apps covertly recording using the microphone, even when the microphone was granted full permission on those apps.
[00:24:27.160 --> 00:24:29.480] Now, admittedly, this is from 2019.
[00:24:29.480 --> 00:24:30.920] This is pre-TikTok.
[00:24:30.920 --> 00:24:35.240] CMG have only started promoting their active listening products over the last couple of years.
[00:24:35.480 --> 00:24:41.400] It's possible that Facebook have become worse, or Google have changed their practices, or Amazon have started doing stuff.
[00:24:41.800 --> 00:24:43.320] So maybe things have changed.
[00:24:43.320 --> 00:24:46.160] Maybe they have found a way to do this these days.
[00:24:46.160 --> 00:24:53.600] Now, one possible way you could actually do this is by exploiting the way the pattern matching for the wake word works.
[00:24:53.600 --> 00:25:03.040] So sometimes you say something that isn't the wake word, like, oh, hey, seriously, or okay, cool, or something like that.
[00:25:03.040 --> 00:25:05.040] That the device.
[00:25:05.040 --> 00:25:05.840] Yes.
[00:25:05.840 --> 00:25:14.160] That the device mistakes for the wake word and then will record a section of conversation and upload that to the cloud.
[00:25:14.160 --> 00:25:20.800] It's also possible that when you're making a legitimate request to one of these tools, someone else is having a conversation at the same time in the background.
[00:25:21.120 --> 00:25:24.800] And that then gets recorded and uploaded and ingested into these systems.
[00:25:24.800 --> 00:25:34.000] So there are plausible mechanisms where unexpected segments of your conversation could end up being sent to Google or Amazon or Apple or whatever.
[00:25:34.000 --> 00:25:41.840] But it doesn't seem like an efficient way to run a business analyzing those snippets.
[00:25:41.840 --> 00:25:43.040] Absolutely not.
[00:25:43.040 --> 00:25:56.560] So in a study published in the journal Computer Speech and Language in 2022, researchers found that devices are accidentally triggered like this on average about once an hour by a TV show or a news broadcast or whatever.
[00:25:56.560 --> 00:26:04.960] And in around half of those cases, a significant amount of audio recording, which is over 10 seconds of audio, is uploaded for processing.
[00:26:04.960 --> 00:26:06.320] And that could contain anything.
[00:26:06.560 --> 00:26:06.880] Right.
[00:26:06.880 --> 00:26:11.920] And that's what about 240 minutes if it happens every single hour over the 24-hour period.
[00:26:11.920 --> 00:26:17.120] So it's entirely plausible that Amazon, Google, Apple, or whatever are taking these accidental quotes.
[00:26:17.200 --> 00:26:18.640] Sorry, 240 seconds, not minutes.
[00:26:18.640 --> 00:26:18.960] Sorry.
[00:26:18.960 --> 00:26:19.520] Before I open it.
[00:26:18.920 --> 00:26:20.640] Indeed, before anyone mentioned it.
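For anyone doing the sums behind that correction, a rough sketch using the figures as quoted from the study; treating every trigger as a ten-second upload is the generous upper end, since only around half of triggers exceed ten seconds:
triggers_per_day = 24        # roughly one accidental trigger per hour
seconds_per_upload = 10      # "over 10 seconds" of audio in about half of cases

upper_bound = triggers_per_day * seconds_per_upload
print(f"~{upper_bound} seconds of accidentally captured audio per day, about 4 minutes")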
[00:26:20.640 --> 00:26:26.720] And they could be taking that and saying, well, the user sent this this by mistake, but let's feed it into the ad model anyway.
[00:26:27.040 --> 00:26:28.080] You know, why not?
[00:26:28.080 --> 00:26:30.040] Facebook could be doing the same thing.
[00:26:30.040 --> 00:26:35.480] They don't have their own voice assistant software, but they could take the audio from videos you upload to your Facebook.
[00:26:29.680 --> 00:26:36.200] Like I'm saying, yeah.
[00:26:36.520 --> 00:26:41.880] Or if you just open the camera app, or if you send someone a voice memo or whatever, they could listen to all of these things.
[00:26:41.880 --> 00:26:48.680] But also, like, when they can analyze the data from when you actually say to your phone, okay, Google, show me the nearest ice cream shop.
[00:26:48.920 --> 00:26:55.160] Indeed, like, yeah, there are times when you are giving them the stuff that they can use to market at you.
[00:26:55.160 --> 00:26:58.920] So, all that said, these are technically plausible mechanisms.
[00:26:58.920 --> 00:27:07.720] There is no evidence that this is what was happening, and no evidence that Cox Media Group was using this technique for their active listening product.
[00:27:07.720 --> 00:27:11.800] And in fact, there's good reasons to think that they are not using these techniques.
[00:27:11.800 --> 00:27:14.520] So, first was the price of their product.
[00:27:14.520 --> 00:27:24.280] They were advertising their active listening product as $100 per day for the 10-mile radius product and $200 per day for the 20-mile radius product.
[00:27:24.600 --> 00:27:29.480] That's absurdly cheap for this sort of surveillance advertising product.
[00:27:29.480 --> 00:27:42.840] If it really worked that way, if it really works the way that they were advertising it as working, they would need to spend significantly more money on the technical infrastructure to make it work than they are getting from customers by selling them this product.
[00:27:42.840 --> 00:27:48.200] Because you're thinking from this location in a 10-mile radius, how many phones are in that space?
[00:27:48.200 --> 00:27:56.840] I would need to be crunching data on essentially all of those constantly or accessing something that they're constantly like they could say that they're not the ones doing that.
[00:27:56.840 --> 00:28:00.600] That there's another service out there that's automatically doing that, and they have access to that database.
[00:28:00.600 --> 00:28:05.320] But yeah, this sort of bulk audio processing is not cheap.
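A purely hypothetical back-of-envelope for why that pricing looks implausible; every number below is invented for illustration and none of them appear in the pitch deck or anywhere else in this discussion:
phones_in_radius = 50_000        # hypothetical devices within the 10-mile radius
hours_captured_per_day = 12      # hypothetical waking hours of audio per device
cost_per_audio_hour = 0.01       # hypothetical dollars for upload, storage and transcription

daily_cost = phones_in_radius * hours_captured_per_day * cost_per_audio_hour
print(f"hypothetical processing cost: ${daily_cost:,.0f} per day, against the $100 per day charged")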
[00:28:05.640 --> 00:28:10.120] Second reason why I don't think this is happening was the lack of response from regulators.
[00:28:10.120 --> 00:28:26.960] When 404 Media blew the whistle on this, on what on what Cox Media Group were doing, there were raised eyebrows in the technical press and the software development and cyber security communities, and no appreciable response at all from regulators in the UK, the US, Australia, the EU, nothing.
[00:28:26.960 --> 00:28:28.720] I think the EU is the most significant one.
[00:28:28.880 --> 00:28:35.120] Yes, because there's no response from US regulators.
[00:28:35.440 --> 00:28:37.440] There's no response from US regulators to anything.
[00:28:37.760 --> 00:28:50.480] Well, oddly enough, the closest we got to a response from any regulator was a MAGA Republican asking questions about it and saying, well, Facebook's listening to our stuff and it's a conspiracy in Zuckerberg and et cetera, et cetera, et cetera.
[00:28:50.480 --> 00:28:54.560] And they wrote a message and said, I want an explanation for what's going on here.
[00:28:54.560 --> 00:28:55.920] And then it never went anywhere else.
[00:28:55.920 --> 00:29:00.080] Yeah, because they wrote a message and Mark Zuckerberg wrote a check and then it was all fine.
[00:29:01.360 --> 00:29:12.160] And if Cox Media Group were actually doing what they were doing, I would have expected a much stronger response or indeed any kind of response from regulators and there was none.
[00:29:12.160 --> 00:29:13.680] There was nothing.
[00:29:14.000 --> 00:29:20.160] But finally, and this is the biggest one, this is just not a practical way to do surveillance advertising.
[00:29:20.160 --> 00:29:26.480] Fetching, uploading, storing, processing huge amounts of audio data like this cost a fucking bomb.
[00:29:26.480 --> 00:29:27.120] Yeah.
[00:29:27.120 --> 00:29:31.840] It would cause a noticeable drain on people's battery on their devices.
[00:29:31.840 --> 00:29:37.120] It would burn through metered data plans on people's phones, which are still very common on mobile phone contracts.
[00:29:37.120 --> 00:29:40.000] I've got 60 gig of data, 30 gig of data, whatever.
[00:29:40.000 --> 00:29:49.480] It would burn through that, all while offering little to no advantage over all of the existing ways that they've got of doing surveillance advertising cheaper.
[00:29:49.680 --> 00:29:51.520] Yeah, if anything, it would muddy the waters.
[00:29:51.520 --> 00:29:56.240] It would muddy your data with so much noise that it could actually detract from your other methods that were better.
[00:29:56.240 --> 00:30:08.200] Yeah, so the methods they already have, tracking which websites you visit, which searches you do, your IP address, your location, what do you post on Facebook, what do you buy from Amazon, what do you post on Twitter?
[00:30:08.200 --> 00:30:22.280] The combination of the technical impracticality of this, the legal risk of this, and questionable economic benefit makes the systematic audio surveillance practically impossible to implement at any kind of scale.
[00:30:22.280 --> 00:30:27.400] There is no reason why you would do this because all of the other ways of doing it are massively cheaper.
[00:30:27.400 --> 00:30:28.280] Yeah.
[00:30:28.920 --> 00:30:41.400] The reason you might see an ad for a mattress right after talking to your partner about needing a new mattress is a combination of confirmation bias and the existing surveillance advertising mechanisms that are already in place.
[00:30:41.400 --> 00:30:41.640] Yeah.
[00:30:41.640 --> 00:30:44.840] Like maybe you've Googled for a mattress at some point.
[00:30:44.840 --> 00:30:46.360] You've searched Amazon for a mattress.
[00:30:46.520 --> 00:30:49.800] Or your friends sent you a Facebook message about how they just got a new mattress and it was great.
[00:30:49.800 --> 00:30:51.240] And you thought, oh, maybe it's time that I do.
[00:30:51.240 --> 00:30:54.120] But Facebook knows that you received that message.
[00:30:54.120 --> 00:30:54.440] Yeah.
[00:30:54.440 --> 00:31:04.920] It's going to be that or the fact that half the fucking internet is ads for mattresses these days, or meal kits, or VPNs, or a really exciting new hose pipe, which is a very exciting new hose pipe.
[00:31:05.080 --> 00:31:07.400] Massively exciting new hose pipes.
[00:31:07.400 --> 00:31:13.000] Or it's Thomas Smith advertising insurance, I think, against his will because he did not do that.
[00:31:13.000 --> 00:31:16.440] So people may have heard an advert with Thomas Smith doing that on our podcast.
[00:31:16.520 --> 00:31:17.240] On the show, yeah.
[00:31:17.240 --> 00:31:20.120] And he is livid because he did not give them permission to use that audio.
[00:31:20.280 --> 00:31:22.600] Yeah, they just clipped it out of his show.
[00:31:23.560 --> 00:31:24.680] By mistake, apparently.
[00:31:24.680 --> 00:31:26.040] By mistake, they did that.
[00:31:26.040 --> 00:31:26.840] It happens by accident.
[00:31:27.000 --> 00:31:30.040] You accidentally download Thomas's show and edit his ad out and then sell it.
[00:31:30.200 --> 00:31:31.080] And then put it in other shows.
[00:31:31.400 --> 00:31:31.880] Yeah.
[00:31:32.200 --> 00:31:41.160] So, if the data suggests this isn't happening, and it's not likely to be happening at least at any sort of scale...
[00:31:41.160 --> 00:31:43.720] What the fuck was CMG selling to their customers?
[00:31:43.720 --> 00:31:46.560] Yeah, this is the question that was occurring to me as we were going through this.
[00:31:47.200 --> 00:31:49.680] They were just like, Why are they lying about this?
[00:31:49.920 --> 00:31:51.600] What's to gain from that?
[00:31:51.600 --> 00:31:57.040] So, CMG have now removed all references to the active listening products from their website.
[00:31:57.040 --> 00:32:00.240] They denied that they were ever using any audio recordings.
[00:32:00.240 --> 00:32:08.160] Their position is now that they just aggregate data from third parties and they do not use conversations recorded on people's devices.
[00:32:08.160 --> 00:32:14.720] No whistleblowers or former employees have come forward with technical details about what CMG were doing.
[00:32:15.040 --> 00:32:21.920] No internal documents, technical specifications, or operational procedures have leaked beyond the pitch deck.
[00:32:21.920 --> 00:32:27.680] Google, Apple, and Amazon have all denied selling voice data to anybody, much less to CMG.
[00:32:27.680 --> 00:32:36.800] I suppose Google and Amazon could be feeding their own ad targeting platforms with this accidentally captured data around the wake word, etc.
[00:32:37.120 --> 00:32:43.920] So, they aren't selling the audio recordings, but maybe they're selling the insights gleaned from those audio recordings.
[00:32:43.920 --> 00:32:52.640] But there's no real evidence that that interpretation is true either, other than CMG's initial claims that those were insights that they had.
[00:32:52.960 --> 00:32:56.880] So, for my money, frankly, I think this was just marketing hyperbole.
[00:32:56.880 --> 00:33:02.480] I don't think there was ever any real, legitimate technical capability at CMG to do this.
[00:33:02.480 --> 00:33:07.040] I might be proven wrong in time, but right now, I think, frankly, this was just CMG
[00:33:07.040 --> 00:33:08.640] being full of shit.
[00:33:12.480 --> 00:33:15.360] So, we had the Merseyside Skeptics summer picnic.
[00:33:15.360 --> 00:33:15.920] We did.
[00:33:15.920 --> 00:33:16.880] It was lovely.
[00:33:17.120 --> 00:33:24.480] So, as Marsh mentioned on the last show, we've been in a heat wave, and the Saturday before the picnic was horrifically hot.
[00:33:24.640 --> 00:33:25.120] It was bad.
[00:33:25.600 --> 00:33:29.520] And luckily, the Sunday that we actually had the picnic on was really temperate.
[00:33:29.520 --> 00:33:30.000] It was lovely.
[00:33:30.200 --> 00:33:30.760] Yeah, it was lovely.
[00:33:30.760 --> 00:33:33.080] We had a lot of people turn up who we hadn't seen before.
[00:33:33.800 --> 00:33:35.640] People who might even come back, and some listeners.
[00:33:35.640 --> 00:33:37.400] In fact, people who listen to the show came along.
[00:33:37.480 --> 00:33:38.920] We've seen some listeners there, which was nice.
[00:33:39.240 --> 00:33:47.720] The board meeting we had before the picnic, we put contingency plans in place saying if the weather is really bad, then we might have to cancel a picnic or pull the picnic or whatever.
[00:33:47.720 --> 00:33:50.920] I didn't anticipate we might have to consider pulling it because it's too hot.
[00:33:51.640 --> 00:33:59.560] We did have to have some, we had a board member and an attendee bring emergency coverings like a gazebo and a thing.
[00:33:59.720 --> 00:34:01.080] We had a lot of ice.
[00:34:01.080 --> 00:34:08.920] But yeah, Phil went out to Costco and bought us a load of stuff, and he bought, at Alice's insistence more than anything, a nice big Costco cake.
[00:34:09.800 --> 00:34:10.520] Beautiful Costco cake.
[00:34:10.680 --> 00:34:11.720] I love a Costco cake.
[00:34:11.720 --> 00:34:24.760] It's a really nice cake with a really nice buttercream icing and not the fucking cream cheese icing that's everywhere these days, which I don't like, but like a proper frothy, fluffy buttercream icing.
[00:34:24.760 --> 00:34:25.720] I didn't like the icing.
[00:34:26.200 --> 00:34:27.000] I did not like the icing.
[00:34:27.240 --> 00:34:29.000] You had a piece that had far too much icing on it.
[00:34:29.320 --> 00:34:31.240] I want an icing to be slightly firmer than that.
[00:34:31.720 --> 00:34:33.560] It was more like a whipped cream than an icing for me.
[00:34:33.800 --> 00:34:48.440] It's a delicious cake, but because they're fucking massive, and I'm basically the only person I know who loves Costco cakes, I never get them because I can't eat an entire cake to myself more quickly than it will go off and stale and horrible.
[00:34:48.760 --> 00:34:51.240] And Costco will put a pattern on your cake as well.
[00:34:51.240 --> 00:34:52.440] They'll print a pattern on the top.
[00:34:52.840 --> 00:34:56.840] He stuck it on when we arrived, and that's why it was slightly off-centre.
[00:34:56.840 --> 00:35:01.480] Yeah, so he tried to put it on, and I refused to look at it because I said, if I look at that, it's going to piss me off.
[00:35:01.480 --> 00:35:02.120] Oh, it was only off-centre.
[00:35:02.200 --> 00:35:03.240] It wasn't on the wonk at all.
[00:35:03.800 --> 00:35:05.080] It was beautifully straight.
[00:35:05.560 --> 00:35:06.360] It just wasn't centered.
[00:35:06.520 --> 00:35:08.120] It wasn't perfectly centered.
[00:35:08.120 --> 00:35:09.880] It was a little bit off to one side.
[00:35:09.880 --> 00:35:11.320] So Phil came up with this plan.
[00:35:11.320 --> 00:35:15.840] He said, what we'll do is we'll start cutting the cake from the right-hand side.
[00:35:14.920 --> 00:35:17.840] And then at some point, it looks even.
[00:35:18.160 --> 00:35:23.520] And then eventually we cut enough cake and no one knows that it wasn't centered the whole time.
[00:35:23.520 --> 00:35:24.560] That's brilliant.
[00:35:24.560 --> 00:35:29.200] So when we came to cut the cake, what happened was we ignored that and started cutting it from the left-hand side.
[00:35:29.200 --> 00:35:30.400] Alice started cutting it from the corner.
[00:35:30.640 --> 00:35:40.720] What's especially impressive about this epic failure is that not only did I start cutting because I do automatically do things left to right.
[00:35:40.720 --> 00:35:41.840] I read left to right.
[00:35:41.840 --> 00:35:42.480] I do everything.
[00:35:42.480 --> 00:35:44.320] When I'm organizing things, I do it left to right.
[00:35:44.320 --> 00:35:46.320] When I put my contact lenses in, I go left to right.
[00:35:46.320 --> 00:35:47.680] Like, I do everything left to right.
[00:35:47.680 --> 00:35:56.240] So I just habitually went left to right, despite the fact that that required me to use my left hand when I'm right-handed to cut the cake.
[00:35:56.240 --> 00:36:00.960] And you have to move someone out of the way who was standing there, because obviously that's where you wouldn't be cutting.
[00:36:00.960 --> 00:36:03.760] If you were to cut an aubergine, would you cut it left to right?
[00:36:04.560 --> 00:36:04.960] What?
[00:36:04.960 --> 00:36:05.520] No?
[00:36:06.000 --> 00:36:06.960] You said you did everything left.
[00:36:07.200 --> 00:36:07.520] You would do that.
[00:36:07.680 --> 00:36:09.040] You said you did everything left to right.
[00:36:09.040 --> 00:36:09.600] No, that's true.
[00:36:09.600 --> 00:36:10.000] I would do it.
[00:36:10.320 --> 00:36:12.720] You cut an aubergine, you'd hold it with your left hand and you'd cut from the right.
[00:36:12.800 --> 00:36:13.760] You could cut from the right, yeah.
[00:36:14.160 --> 00:36:15.120] And you were cutting.
[00:36:15.120 --> 00:36:16.240] This was using a knife.
[00:36:16.480 --> 00:36:18.880] This was much more like cutting an aubergine than writing something.
[00:36:20.160 --> 00:36:21.200] I don't know.
[00:36:21.200 --> 00:36:22.720] It just went wrong.
[00:36:22.720 --> 00:36:25.520] It ended up crooked because I was using my left hand.
[00:36:25.920 --> 00:36:28.080] I didn't want to be cutting the cake.
[00:36:28.080 --> 00:36:30.160] I said I waited.
[00:36:30.160 --> 00:36:36.160] I wasn't getting the cake started because I didn't want to be left with the responsibility, because once you cut one piece of cake...
[00:36:36.160 --> 00:36:38.160] You've got to give everybody a piece of cake.
[00:36:38.160 --> 00:36:39.760] And I didn't want to do that.
[00:36:39.760 --> 00:36:43.920] So I ended up doing something I didn't want to do and doing it badly.
[00:36:44.160 --> 00:36:45.600] Very, very badly.
[00:36:45.600 --> 00:36:48.880] So we had this cake with a beautiful MSS logo on the top.
[00:36:48.880 --> 00:36:51.200] And Alice, you cut yourself off a slice of cake.
[00:36:51.200 --> 00:36:54.480] You didn't want to start being the cake dispenser for everyone.
[00:36:54.640 --> 00:36:55.440] So you had to slice it.
[00:36:55.680 --> 00:36:56.160] But I did.
[00:36:56.160 --> 00:36:56.720] But I did.
[00:36:56.720 --> 00:36:57.240] I did.
[00:36:56.800 --> 00:36:58.160] You did a couple of slices.
[00:36:58.400 --> 00:36:59.480] Eight or nine slices of cake.
[00:36:59.640 --> 00:37:00.440] A big cake like that.
[00:36:59.120 --> 00:37:02.360] You do one stripe down the full length of it.
[00:37:02.360 --> 00:37:03.320] You cut that into pieces.
[00:36:59.280 --> 00:37:04.840] There's the cake that gets wiped.
[00:37:05.000 --> 00:37:07.640] And then someone else comes along when there's none left and does the next stripe.
[00:37:07.640 --> 00:37:08.040] Yes.
[00:37:08.200 --> 00:37:14.760] And then Warren, you know, very helpfully went in and started cutting out slices of cake for people and making little piles of cake so people could come up.
[00:37:14.760 --> 00:37:15.960] And then he stopped.
[00:37:15.960 --> 00:37:21.960] And he stopped about a third of the way through the cake, having cut the M off the MSS logo.
[00:37:22.280 --> 00:37:24.600] So now we had a cake that just said SS.
[00:37:24.600 --> 00:37:26.120] And it was SS on a wonk as well.
[00:37:26.120 --> 00:37:30.280] It was very obviously SS, because there was loads of space either side with no rice paper.
[00:37:30.600 --> 00:37:33.560] And I'm sitting there, bald, with a skinhead.
[00:37:33.800 --> 00:37:36.200] I'm sitting there, bald, with a skinhead.
[00:37:36.440 --> 00:37:38.680] We've got a cake that says SS on it.
[00:37:38.680 --> 00:37:40.280] It was a really bad look.
[00:37:40.280 --> 00:37:40.680] It was.
[00:37:41.480 --> 00:37:42.840] We need to eat more cake.
[00:37:42.840 --> 00:37:43.080] Yep.
[00:37:43.080 --> 00:37:45.000] And we had a flag that looked like a rifle as well.
[00:37:45.000 --> 00:37:47.640] We did not have hands at all.
[00:37:52.440 --> 00:37:55.000] So for QED, tickets are still sold out.
[00:37:55.000 --> 00:38:00.840] No more tickets have magically appeared, but you can get an online ticket to QED for £49.
[00:38:00.840 --> 00:38:03.080] You can do that by going to QEDcon.org.
[00:38:03.080 --> 00:38:11.160] And that gets you access to the live stream of the main stage, of the panel room, and of the live podcast room for the whole weekend.
[00:38:11.160 --> 00:38:14.280] And yes, you can go and buy that now at QEDcon.org.
[00:38:14.280 --> 00:38:14.680] You can.
[00:38:14.680 --> 00:38:16.040] It's a very straightforward process.
[00:38:16.360 --> 00:38:17.000] It'll be lots of fun.
[00:38:17.000 --> 00:38:21.720] You'll also find all the latest updates of what we're doing at QED for the final ever QED.
[00:38:21.960 --> 00:38:22.680] Final ever QED.
[00:38:22.840 --> 00:38:24.920] Be delighted if you came and took part.
[00:38:24.920 --> 00:38:28.200] Also, we should talk about our Patreon.
[00:38:28.200 --> 00:38:37.400] So if you like what we do, if you enjoy the show and you would like to support us, you can do that by visiting patreon.com forward slash skeptics with a K, where you can donate from as little as a pound a month.
[00:38:37.400 --> 00:38:40.600] And that gets you access to an ad-free version of this show.
[00:38:40.600 --> 00:38:43.400] We never listen to what you say when you're listening to the show.
[00:38:43.400 --> 00:38:45.680] Otherwise, you just get a lot of ads with swearing in them.
[00:38:46.880 --> 00:38:50.960] I imagine like we swear a lot, and you're probably going, Listen, these fucking wankers.
[00:38:44.920 --> 00:38:51.840] So don't listen.
[00:38:52.080 --> 00:38:54.480] We don't listen to your microphone when you listen to the show.
[00:38:54.480 --> 00:38:58.640] But yeah, you get an ad-free version of this show from as little as a pound a month.
[00:38:58.640 --> 00:39:05.200] And that's where you can support all the long hours that we put into actually writing and recording the show.
[00:39:05.200 --> 00:39:07.920] Especially in the fucking hot weather, Christ.
[00:39:07.920 --> 00:39:09.600] So, yes, you can do that.
[00:39:09.600 --> 00:39:12.640] And if you're not able to do that, of course, you can listen to the show with ads.
[00:39:12.640 --> 00:39:14.240] That also helps us out as well.
[00:39:14.240 --> 00:39:20.560] And the other thing that can help us out is if you leave a review for the show, that is a very helpful thing for you to do as well.
[00:39:20.560 --> 00:39:25.040] I found out what Apple's algorithm is for where you go in the iTunes charts.
[00:39:25.040 --> 00:39:25.680] Oh, really?
[00:39:25.680 --> 00:39:30.960] And it's the number of new subscribers in the last seven days.
[00:39:31.280 --> 00:39:40.640] So if you get a lot of new subscribers all at once, that's when you go up the charts, and you garner more subscribers, and then go further up the charts and garner more subscribers.
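Taking that description at face value, the chart position would just be a sort on a rolling seven-day count of new subscribers. Here is a toy sketch of that, assuming the claim is accurate; the show names, dates, and subscriber numbers below are all invented.

```python
# Toy illustration of a chart ranked purely by new subscribers in the last
# seven days. The show names, dates, and numbers are all invented.

from datetime import date, timedelta

# show -> dates on which each new subscriber signed up (hypothetical data)
subscriptions = {
    "Podcast A (big back catalogue, slow growth)": [date(2025, 7, 1)] * 500
                                                   + [date(2025, 7, 22)] * 10,
    "Podcast B (small, but a burst this week)": [date(2025, 7, 23)] * 80,
}

def new_subs_last_7_days(sign_up_dates, today):
    """Count sign-ups inside the trailing seven-day window."""
    cutoff = today - timedelta(days=7)
    return sum(1 for d in sign_up_dates if d > cutoff)

today = date(2025, 7, 24)
chart = sorted(
    subscriptions,
    key=lambda show: new_subs_last_7_days(subscriptions[show], today),
    reverse=True,
)

for rank, show in enumerate(chart, start=1):
    print(rank, show, new_subs_last_7_days(subscriptions[show], today))
```

Which is why a burst of sign-ups in a single week moves a show up the chart far more than a large but static existing audience does.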
[00:39:40.640 --> 00:39:45.440] So listeners, what you need to do is go around, steal the phone of your friends and subscribe to the show.
[00:39:45.920 --> 00:39:46.480] 100%.
[00:39:46.720 --> 00:39:50.720] And that's the way to just go around that and increase our number of subscribers.
[00:39:51.040 --> 00:39:55.840] Buy four phones, put two in one room, two in a quiet room, subscribe them all to Skeptics with a K.
[00:39:55.840 --> 00:39:57.200] That's the way to do that.
[00:39:57.200 --> 00:40:04.000] You can also, if you want to support the work of the Merseyside Skeptics Society, do that at patreon.com forward slash Merseyside Skeptics.
[00:40:04.000 --> 00:40:08.560] That also is nice and cheap and gets you an ad-free version of this show.
[00:40:08.560 --> 00:40:09.600] I think that's it then.
[00:40:09.600 --> 00:40:10.560] Is that all we've got time for?
[00:40:10.560 --> 00:40:12.000] I think that's all we've got time for.
[00:40:12.000 --> 00:40:13.360] That's all I've got time for.
[00:40:13.680 --> 00:40:17.360] All that remains is for me to thank Alice for coming on today.
[00:40:17.360 --> 00:40:17.760] Thank you.
[00:40:17.760 --> 00:40:18.560] Thank you to Marsh.
[00:40:18.640 --> 00:40:19.120] Thank you.
[00:40:19.280 --> 00:40:21.680] We've been Skeptics with a K, and we will see you next time.
[00:40:21.680 --> 00:40:22.320] Bye now.
[00:40:22.320 --> 00:40:23.120] Bye.
[00:40:27.920 --> 00:40:33.000] Skeptics with a K is produced by Skeptic Media in association with the Merseyside Skeptic Society.
[00:40:29.920 --> 00:40:41.960] For questions or comments, email podcast@skepticswithak.org, and you can find out more about the Merseyside Skeptics at merseysideskeptics.org.uk.