Debug Information
Processing Details
- VTT File: skeptoid-3107.vtt
- Processing Time: September 11, 2025 at 02:40 PM
- Total Chunks: 1
- Transcript Length: 56,292 characters
- Caption Count: 437 captions
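The caption count and transcript length above can in principle be reproduced directly from the VTT file. The following is a minimal, hypothetical sketch rather than the pipeline's actual code: it assumes standard WebVTT cue timing lines of the form HH:MM:SS.mmm --> HH:MM:SS.mmm, and the character-counting rule (summing the lengths of caption text lines) is an assumption.

```python
# Minimal sketch: reproduce the caption count and transcript-length figures above
# from a WebVTT file. Assumes cue timing lines of the form
# "HH:MM:SS.mmm --> HH:MM:SS.mmm"; the real pipeline's counting rules may differ.
import re
from pathlib import Path

CUE_RE = re.compile(r"^\d{2}:\d{2}:\d{2}\.\d{3} --> \d{2}:\d{2}:\d{2}\.\d{3}")

def vtt_stats(path: str) -> tuple[int, int]:
    captions = 0
    text_chars = 0
    for raw_line in Path(path).read_text(encoding="utf-8").splitlines():
        line = raw_line.strip()
        if CUE_RE.match(line):
            captions += 1              # one timing line per caption cue
        elif line and line != "WEBVTT":
            text_chars += len(line)    # accumulate caption text length
    return captions, text_chars

if __name__ == "__main__":
    # File name taken from the VTT File field above
    count, length = vtt_stats("skeptoid-3107.vtt")
    print(f"Caption Count: {count} captions")
    print(f"Transcript Length: {length:,} characters")
```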
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 1 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:03.120 --> 00:00:04.320] Hey, it's Brian.
[00:00:04.320 --> 00:00:10.720] Today I'm doing things a little differently and bringing you a preview of a new audiobook I think you'll really like.
[00:00:10.720 --> 00:00:23.840] It's called Douglas Adams The Ends of the Earth, and it celebrates the wit and wisdom of the legendary science fiction author of Hitchhiker's Guide to the Galaxy and Dirk Gently's Holistic Detective Agency.
[00:00:23.840 --> 00:00:35.040] Douglas Adams was someone who thought deeply about the biggest problems in the world, from the internet to artificial intelligence to space exploration, politics, and conservation.
[00:00:35.040 --> 00:00:42.800] He was a sharp critic and a profoundly disruptive thinker of the way we do things, just like us here at Skeptoid.
[00:00:42.800 --> 00:00:53.440] We've often covered the many ways technology has been shaped and shifted over time, so we think you'll enjoy hearing Adams' thoughts on tech in this preview.
[00:00:53.440 --> 00:01:01.280] It's a journey into the mind of a man who foresaw the technological age in all its wonder and terror.
[00:01:01.600 --> 00:01:03.120] So here is the preview.
[00:01:03.120 --> 00:01:05.520] We hope you enjoy it as much as we did.
[00:01:05.520 --> 00:01:17.760] If you do, you can get Douglas Adams' The Ends of the Earth now at Audible, Spotify, Pushkin.fm/audiobooks, or wherever audiobooks are sold.
[00:01:21.600 --> 00:01:22.800] Chapter 3.
[00:01:22.800 --> 00:01:24.560] Everything is Connected.
[00:01:24.560 --> 00:01:26.880] The Internet We Didn't Get.
[00:01:27.200 --> 00:01:35.840] The computer scientist Danny Hillis came up with a brilliant definition of technology, which is technology is stuff that doesn't work yet.
[00:01:37.440 --> 00:01:43.600] Now, the interesting thing is everything we've invented was at some point technology.
[00:01:43.600 --> 00:01:46.640] I mean, a chair was technology before we'd figured it out.
[00:01:46.640 --> 00:01:51.760] Now we know what a chair is, but before we did, you know, people were trying to figure out, well, how many legs should it have?
[00:01:51.760 --> 00:01:52.960] How high should it be?
[00:01:53.200 --> 00:01:55.760] And until we got it, they used to crash.
[00:01:56.080 --> 00:02:02.520] Now, of course, we know everything you need to know about a chair, so we no longer think of chairs as technology, we just use them.
[00:02:02.520 --> 00:02:10.200] Now, it's very odd that we still tend to think of bathroom taps as technology, and we keep on reinventing them and reinventing them.
[00:02:10.200 --> 00:02:16.760] Nowadays, particularly in the hotel, you can hardly walk into a washroom without you've got to figure it out all over again.
[00:02:16.760 --> 00:02:17.640] Now, how do I open?
[00:02:18.280 --> 00:02:20.120] Am I responsible for turning it on?
[00:02:20.120 --> 00:02:21.400] Do I do it with my hands?
[00:02:21.400 --> 00:02:22.680] Do I just look at it?
[00:02:22.680 --> 00:02:24.920] Do I use my elbows?
[00:02:24.920 --> 00:02:27.560] And then you think, now who's responsible for turning it off?
[00:02:27.560 --> 00:02:28.920] Do I just walk away?
[00:02:28.920 --> 00:02:37.880] Or you know, we can't leave the damn things alone because we're so relentlessly, ceaselessly inventive.
[00:02:37.880 --> 00:02:41.560] So we keep on inventing stuff maybe that doesn't need to be reinvented.
[00:02:41.720 --> 00:02:45.800] I would think it's probably time to call a moratorium on bathroom taps.
[00:02:45.800 --> 00:02:51.960] Chairs and bathroom taps aside, Douglas was obsessed with all types of technology.
[00:02:51.960 --> 00:02:57.560] But the specific type of technology that obsessed Douglas above all else was computing.
[00:02:57.560 --> 00:03:03.160] And specifically, what happens when you start to connect computers together on a network?
[00:03:03.160 --> 00:03:09.720] Now, many years ago, I invented a thing called the Hitchhiker's Guide to the Galaxy.
[00:03:10.680 --> 00:03:15.480] I never meant to be a predictive science fiction writer in the mode of, say, Arthur C.
[00:03:15.480 --> 00:03:16.520] Clarke.
[00:03:16.520 --> 00:03:24.360] My reason for inventing it was purely one of narrative necessity, which is that I had lots of extra bits of story that I didn't know what to do with.
[00:03:25.000 --> 00:03:30.840] I didn't know anything at all about technology in those days, and I didn't think twice about any of the issues.
[00:03:30.840 --> 00:03:34.280] So I just made it a bit like things I was already familiar with.
[00:03:34.280 --> 00:03:37.880] It was a bit like a pocket calculator, a bit like a TV remote.
[00:03:37.880 --> 00:03:41.880] It had a little window at the top and lots and lots of little buttons on the front.
[00:03:41.880 --> 00:03:53.920] The Hitchhiker's Guide to the Galaxy, as it manifests itself in the five novels that bear its name, is an anarchic, attitude-filled, user-generated publication.
[00:03:53.920 --> 00:04:01.600] Today, we would call it an app: Wikipedia on acid, with aggressive social media functionality.
[00:04:01.600 --> 00:04:14.160] But when Douglas Adams first described it in 1979, neither the iPad, nor the internet, nor Wikipedia, nor social media had been invented.
[00:04:14.480 --> 00:04:25.680] One of the strangest things about the blurring between fiction and real life in Douglas's career was not that he predicted technologies that later came to pass.
[00:04:25.680 --> 00:04:30.240] Many science fiction writers, as Douglas pointed out himself, from Arthur C.
[00:04:30.240 --> 00:04:33.040] Clarke to William Gibson, have done that.
[00:04:33.360 --> 00:04:48.160] Where Douglas is unusual is that 20 years after writing a book about the guide, he founded a production company, the Digital Village, with a team that included top technologists and internet pioneers.
[00:04:48.160 --> 00:04:55.840] And then he set about trying to actually build a digital guide to the world in real life.
[00:04:56.160 --> 00:04:59.120] But let us not get ahead of ourselves.
[00:04:59.120 --> 00:05:14.720] To understand how deep and old Douglas's love for technology in general and network computing in particular ran, and the strange role he ended up playing in the history of the internet, we need to start at the very beginning.
[00:05:14.720 --> 00:05:18.880] And who was at the very beginning of Douglas's love affair with computers?
[00:05:18.880 --> 00:05:21.440] Stephen Fry, of course.
[00:05:22.720 --> 00:05:25.840] We started talking about computers.
[00:05:25.840 --> 00:05:30.520] And you have to remember that computers then meant isolated objects on a desk.
[00:05:29.600 --> 00:05:33.160] There was no networking, there was no internet.
[00:05:33.480 --> 00:05:42.760] There was, I mean, there were sort of vague versions of it within universities, but there was no way you could get onto the internet from home.
[00:05:42.760 --> 00:05:48.920] And we both, I think, understood that that was the future of computing.
[00:05:48.920 --> 00:06:02.840] And we both were aware of something that became very important in our lives: that Apple computers, who were big in the 70s, they really kick-started the whole home computing craze.
[00:06:02.840 --> 00:06:06.360] They were bringing out a new and rather unusual sort of computer.
[00:06:06.360 --> 00:06:11.400] We met then, I guess, in 83, maybe it was mid-83, something like that.
[00:06:11.720 --> 00:06:19.080] And we were aware that in January 84, Apple were bringing out a new computer, which they were going to call the Macintosh.
[00:06:19.080 --> 00:06:24.280] And I was able to tell him, which he didn't know, that it was called the Macintosh.
[00:06:24.440 --> 00:06:30.920] Nothing to do with raincoats, but because Macintosh is a kind of apple.
[00:06:30.920 --> 00:06:36.040] It's like a Golden Delicious or a Granny Smith or a Cox's Pippin.
[00:06:36.040 --> 00:06:39.240] But there is the Macintosh apple, very popular in America.
[00:06:39.240 --> 00:06:40.120] Hence, you know.
[00:06:40.120 --> 00:06:41.640] So it's an apple.
[00:06:41.640 --> 00:06:47.880] Anyway, I could get bogged down in all these ridiculous memories that would mean nothing to anybody else.
[00:06:47.880 --> 00:06:52.600] But we were, and this is the story I like to tell, and he told it too.
[00:06:52.600 --> 00:06:59.560] So we were the first two people in Britain, possibly Europe, to buy an Apple Macintosh.
[00:06:59.880 --> 00:07:09.080] It is worth mentioning in passing that for some years, Stephen and Douglas would argue about who was actually first in line at the Mac store that day.
[00:07:09.080 --> 00:07:13.800] Whomever it was, Douglas's love for computers predated the Macintosh.
[00:07:13.800 --> 00:07:17.120] Here he is, describing love at first sight.
[00:07:14.520 --> 00:07:20.800] But I remember the first time I ever saw a computer.
[00:07:21.680 --> 00:07:27.200] It was in Lasky's in the Tottenham Court Road, a fond memory.
[00:07:27.840 --> 00:07:29.760] And it was a Commodore PET.
[00:07:30.320 --> 00:07:37.120] I don't know if you remember the very, very early one, which is like a sort of pyramid with a little sort of screen, yay big.
[00:07:37.760 --> 00:07:42.000] And I sat and stared at it.
[00:07:42.320 --> 00:07:46.080] And I was absolutely fascinated by it.
[00:07:46.080 --> 00:07:56.080] And I was trying to figure out any reason, any possible reason why this could have any use to me.
[00:07:56.400 --> 00:08:05.200] And I just couldn't see any, and I was thinking as hard as I could, but I couldn't see any way in which a computer could be useful to me because I was a writer.
[00:08:05.200 --> 00:08:08.640] And I just didn't have that much stuff to add up.
[00:08:12.160 --> 00:08:18.800] And that was because I'd got the wrong idea of what a computer was.
[00:08:18.800 --> 00:08:22.400] I thought it was a kind of super adding machine.
[00:08:22.400 --> 00:08:26.880] But everybody thought it was a kind of super adding machine.
[00:08:26.880 --> 00:08:32.320] That was the model we had in our mind of what a computer was, a super adding machine.
[00:08:32.320 --> 00:08:37.440] And so we developed it as an adding machine with a long feature list.
[00:08:37.760 --> 00:08:44.320] And after a while, we begin to think, well, we're now getting very good at adding up and manipulating these numbers.
[00:08:44.320 --> 00:08:45.920] Is there anything else we can do with it?
[00:08:45.920 --> 00:08:49.440] I mean, supposing we made these numbers stand for something else.
[00:08:49.440 --> 00:08:54.720] Like we made it stand for the letters of the alphabet, for ASCII code.
[00:08:54.720 --> 00:09:01.800] And we suddenly think, my God, we have had such limited imagination.
[00:09:00.000 --> 00:09:07.000] We have failed to see the really exciting possibilities of the machine.
[00:09:07.160 --> 00:09:08.840] How short-sighted we've been.
[00:09:08.840 --> 00:09:12.520] It isn't an adding machine, it's a typewriter.
[00:09:12.840 --> 00:09:17.160] So we developed it as a typewriter with a long feature list.
[00:09:17.160 --> 00:09:22.040] And then after a while, we get on a roll and we think, well, what else can we make these numbers stand for?
[00:09:22.040 --> 00:09:26.200] We can make them stand for the elements of a graphic display.
[00:09:27.800 --> 00:09:30.680] And we come up with yet another paradigm shift.
[00:09:30.680 --> 00:09:35.240] We think, my God, how short-sighted of us to think that it was just an adding machine or a typewriter.
[00:09:35.240 --> 00:09:37.400] It's something much more exciting than any of these.
[00:09:37.400 --> 00:09:42.040] It's a television with a typewriter sat in front of it.
[00:09:43.000 --> 00:09:49.720] And then with the invention of the World Wide Web, we go to yet another paradigm shift and we think, my God, it isn't any of these.
[00:09:49.720 --> 00:09:50.680] It isn't any of these.
[00:09:50.680 --> 00:09:51.880] It isn't just a typewriter.
[00:09:51.880 --> 00:09:52.920] It isn't just an adding machine.
[00:09:52.920 --> 00:09:54.360] It isn't a television.
[00:09:54.360 --> 00:09:56.600] It's a brochure.
[00:09:57.240 --> 00:10:06.520] In less than three minutes, almost as a throwaway gag, Douglas sketches out a 20-year intellectual history of personal computing.
[00:10:06.520 --> 00:10:26.440] From adding machine to typewriter to television to brochure, Douglas skewers how at every stage we have had a dramatically limited understanding of how computers were truly transformational, particularly when combined with the power of networked connectivity.
[00:10:27.080 --> 00:10:37.640] The specificity and power of Douglas's vision of what the internet was and what it could be were breathtakingly ahead of his time.
[00:10:37.640 --> 00:10:45.000] Here he is in 2001, speaking to a collection of mobile phone and network technologists and executives.
[00:10:46.480 --> 00:11:29.840] Imagine if every piece of information we ever generated about the world that passed through a computer, whether it's a restaurant typing up its menu for the evening, whether it's a shop maintaining its stock list, whether it's a car noticing what speed it's going, how much petrol it's got left, and where the nearest service stations are, what prices they're charging for petrol are, or whether it's someone measuring the wingspan of an African swallow, or writing down where and when their grandmother was born, or whether it's someone taking a digital photograph from the top of the Great Pyramid, or just a flower that's bloomed late or early this year, or the settings in your thermostat and when it turned on and off.
[00:11:29.840 --> 00:11:34.880] Or if every time you took your child's temperature, the network remembered.
[00:11:35.200 --> 00:11:43.440] Imagine all of that gradually creating a vast shared software model of the world.
[00:11:43.760 --> 00:11:45.520] Just imagine it.
[00:11:47.760 --> 00:11:58.720] The one thing I did get right when I came up with the Hitchhiker's Guide to the Galaxy was that it would know where you were and come up with information appropriately.
[00:11:59.040 --> 00:12:04.960] And that, of course, is the thing that makes the crucial difference to everything in that list above.
[00:12:04.960 --> 00:12:18.320] If every piece of information knew the when of itself and the where of itself, so that the virtual world we created fitted over the real world like an invisible glove.
[00:12:18.640 --> 00:12:32.120] And these devices, these devices that we currently think of as telephones or PDAs, would be the devices that made that invisible world visible to us.
[00:12:32.680 --> 00:12:47.560] Today, in 2025, when we all walk about with geolocated supercomputers in our pockets, what Douglas is describing seems simply an accurate description of the world in which we now live.
[00:12:47.560 --> 00:12:56.600] Back in 2001, it was insane science fiction, except for the companies who were trying to build it.
[00:12:56.600 --> 00:13:12.280] And one of those companies was Douglas's own Digital Village, where he had assembled an enormously talented group of people to try and make, as he described it, the Earth edition of The Hitchhiker's Guide to the Galaxy.
[00:13:12.280 --> 00:13:14.360] Here's Robbie Stamp.
[00:13:14.680 --> 00:13:29.080] I'd written unbidden a business plan called Cable City, which was a recognition that to take advantage of new cable and satellite opportunities, you needed to create an entirely new cost base for programming.
[00:13:29.080 --> 00:13:35.160] And I was telling Douglas about it, and it had morphed by then into something called the Digital Village.
[00:13:35.160 --> 00:13:39.240] The Marshall McLuhan quote influenced that, the Digital Village.
[00:13:39.240 --> 00:13:45.400] And I was sitting there with Douglas, you know, several of our meetings in his house, and he said, oh, that sounds interesting.
[00:13:45.400 --> 00:13:46.840] I'd like to be your partner.
[00:13:46.840 --> 00:13:49.000] How much would it cost me to invest?
[00:13:49.000 --> 00:13:52.600] So I plucked a figure out of the air and he said, I'm in.
[00:13:52.600 --> 00:13:53.320] And that was it.
[00:13:53.320 --> 00:14:00.920] We were then 50-50 partners in this wonderful, one of the best adventures of my working life called the Digital Village.
[00:14:01.240 --> 00:14:09.080] So Robbie and Douglas launched this company, The Digital Village, and they recruit almost at once an amazing group of people.
[00:14:09.080 --> 00:14:11.080] And they take the moonshot.
[00:14:11.080 --> 00:14:14.200] They're going to build the Hitchhiker's Guide.
[00:14:14.200 --> 00:14:19.360] Here's Richard Harris, who served as Chief Technology Officer of the Digital Village.
[00:14:19.920 --> 00:14:23.920] Remember, this is the days before the term social network had been actually coined.
[00:14:23.920 --> 00:14:33.280] You know, you had Tripod and MySpace and all of the CompuServe forums at the time, but very much precursors of social networks.
[00:14:33.280 --> 00:14:42.880] And of course, the Hitchhiker's Guide itself was meant to be an opinionated counterpoint to the Encyclopedia Galactica, the old Asimov construct.
[00:14:42.880 --> 00:14:45.760] But we didn't have an Encyclopedia Galactica at the time.
[00:14:45.760 --> 00:14:50.800] If we were doing it now, we would be building an opinionated counterpoint to Wikipedia.
[00:14:50.800 --> 00:14:54.000] And I would love the challenge of doing that.
[00:14:54.000 --> 00:14:54.720] But we didn't.
[00:14:54.720 --> 00:15:06.640] So we had to come up with a content-driven network of people that created its own content that was self-policing and created trust between people.
[00:15:06.640 --> 00:15:18.880] And what we were all about was delivering people to each other via content and helping them collaboratively build content, which is actually something I think we succeeded on very well with H2G2.
[00:15:19.200 --> 00:15:26.480] Now, there were a lot of people cashing in on the first dot-com boom of the late 90s.
[00:15:26.480 --> 00:15:32.720] Lots of people trying to build social networks and internet sites.
[00:15:32.720 --> 00:15:38.880] But note the specificity of Douglas's vision as described by Harris.
[00:15:39.520 --> 00:15:45.840] We were never in the business of doing a Facebook and simply delivering eyeballs to advertisers.
[00:15:46.160 --> 00:15:52.240] That's the fundamental contradiction in most social networks: they're not about supporting people.
[00:15:52.240 --> 00:15:53.920] They're not about building community.
[00:15:53.920 --> 00:15:57.680] They're about selling that community to paying advertisers.
[00:15:57.680 --> 00:16:00.600] And I think that's a fundamentally flawed model.
[00:15:59.840 --> 00:16:05.080] You need systems that can see what's happening and adapt to them in real time.
[00:16:05.720 --> 00:16:13.320] And you need to give people not just what they knew to ask for, but what they needed to know but didn't know to ask for.
[00:16:13.320 --> 00:16:24.280] And that was, if you like, a fundamental strapline for our development of the real hitchhiker's guide: if you're giving people what they ask for, you're just a search engine.
[00:16:24.280 --> 00:16:30.680] You may be a sophisticated search engine, you may be a constrained, community-driven search engine, but you're still a search engine.
[00:16:30.680 --> 00:16:39.320] But you need to be building an understanding of people sufficient to let them find the stuff that they didn't know to ask for.
[00:16:39.320 --> 00:16:51.080] Part of our challenge was actually: okay, how do we build a knowledge framework that profiles people for their own benefit, not for advertisers' benefit, and helps put them in touch with who and what they needed to know.
[00:16:51.080 --> 00:16:58.120] So it's people to people, people to content, people and content to place, putting all of those together.
[00:16:58.120 --> 00:17:05.160] Now, you might suspect that there is an element of hindsight at work here in Richard's description.
[00:17:05.160 --> 00:17:15.320] But I have the advantage of actually having been in the room when h2g2.com was being planned and built.
[00:17:15.320 --> 00:17:30.360] And I can attest that this thoughtfulness about the dangers of what social media could become and the importance of editors, fact-checkers, and community self-regulation were built into its early design.
[00:17:30.360 --> 00:17:40.200] Those insights came not only from the official technologists and scientists at the company, but often intuitively from Douglas himself.
[00:17:40.520 --> 00:17:42.280] Here's Robbie Stamp again.
[00:17:42.600 --> 00:17:44.520] Everything is connected.
[00:17:44.800 --> 00:17:53.200] I think that deep sense at his core, I think so much of what he explores creatively through almost all of his work is that idea.
[00:17:53.200 --> 00:17:56.960] Everything is connected and everything is perspectival.
[00:17:57.280 --> 00:18:11.280] And I think he saw in the internet an emergent technology which was going to sort of in a way make, if I can use this, make flesh that interconnectedness of things.
[00:18:11.920 --> 00:18:20.960] There were millions and millions and millions of nodes which were going to be connecting in new ways, some of which we could maybe predict and many we couldn't.
[00:18:20.960 --> 00:18:41.920] And I think he found that emergent property intellectually huge and exciting because I think it was an enormous river flowing into his creativity, which was maybe one of the most important wellsprings in my mind was that natural fascination in perspective and in connectivity.
[00:18:41.920 --> 00:18:55.440] And I think that the internet was there, he just saw it very, very, very early on as this massive playground where so much was going to be possible that hadn't been possible going through traditional gatekeepers.
[00:18:56.160 --> 00:19:06.480] But I think Douglas, when I look back to those early conversations, was phenomenally prescient about an awful lot of the ways in which the internet duly has evolved.
[00:19:06.480 --> 00:19:21.680] I think maybe some of the wilder, nastier aspects or the more difficult aspects of what we've learned about social media and the internet, he kind of died before we'd really seen a lot of that.
[00:19:21.680 --> 00:19:34.040] And I think before, you know, the internet found its business model, which was turning human beings into financial derivative products who were sliced, diced, and sold, and resold, and sold in these massive advertising trading marketplaces.
[00:19:34.600 --> 00:19:41.080] So I think he was there in the early optimistic, more wide-eyed days of what was going to be possible.
[00:19:41.080 --> 00:19:47.880] But I think it was that freedom and that recognition of the interconnectedness of everything that thrilled him.
[00:19:48.200 --> 00:20:01.080] This idea, the idea that everything is connected, is one that Douglas rode to logical but absurd extremes in his Dirk Gently's Holistic Detective Agency books.
[00:20:01.720 --> 00:20:07.880] The term holistic refers to my conviction that what we are concerned with here is the fundamental interconnectedness of all things.
[00:20:08.040 --> 00:20:14.120] I do not concern myself with such petty things as fingerprint powder, telltale pieces of pocket fluff, and inane footprints.
[00:20:14.120 --> 00:20:18.280] I see the solution to each problem as being detectable in the pattern and web of the whole.
[00:20:18.280 --> 00:20:26.280] The connections between causes and effects are often much more subtle and complex than we, with our rough and ready understanding of the physical world, might naturally suppose.
[00:20:27.240 --> 00:20:32.920] Dirk Gently is a detective with a sort of superpower.
[00:20:32.920 --> 00:20:40.280] He has a semi-psychic ability to sense the deep interconnections between all things.
[00:20:40.600 --> 00:20:48.360] He intuits which people, which events, which locations are relevant to his current case.
[00:20:48.360 --> 00:20:56.040] And his methods, whilst unconventional, are normally successful or semi-successful.
[00:20:56.040 --> 00:21:00.440] He solves cases with questionable efficiency.
[00:21:00.440 --> 00:21:13.800] The great irony of Dirk Gently as a character is that this hero with a gift for sensing connections is himself profoundly bad at making human connections.
[00:21:13.800 --> 00:21:18.720] And he spends most of his life alone and lonely.
[00:21:19.040 --> 00:21:33.360] Without wanting to post-mortem psychoanalyze Douglas Adams, it is perhaps worth asking why he was so excited to become part of the Digital Village and to run a production company.
[00:21:33.360 --> 00:21:41.360] Here's Ian Charles Stewart, co-founder of Wired Magazine and another of Douglas' partners at the Digital Village.
[00:21:41.360 --> 00:21:44.640] But the main reason we got together wasn't because there was a single driven purpose.
[00:21:44.640 --> 00:21:46.720] It was because he had writer's block.
[00:21:46.720 --> 00:21:56.880] And the challenge was how to create a team around him that allowed creativity to re-emerge for him to be able to continue telling stories in a different way.
[00:21:56.880 --> 00:22:04.560] And so the idea was to create a platform which he could use to tell stories in different media, in new media.
[00:22:04.560 --> 00:22:09.600] The first, of course, was the computer game, Starship Titanic, and we created a team around that.
[00:22:09.600 --> 00:22:11.840] And then the second was thinking about the film.
[00:22:11.840 --> 00:22:25.120] And the notion was that if we gave him, if we took him away from the typewriter and paper and put him in front of new tools with other people, that that might become something which he could get his teeth into and have fun with.
[00:22:25.120 --> 00:22:27.040] And it turned out to be the case.
[00:22:28.000 --> 00:22:31.840] Sophie Aston was Douglas' assistant at the Digital Village.
[00:22:31.840 --> 00:22:38.480] Sophie and I first met when we were both in our early 20s and excited to be around the great Douglas Adams.
[00:22:38.480 --> 00:22:45.120] Here she is recollecting what it was like to have Douglas in an office.
[00:22:45.120 --> 00:23:02.040] He was such a, you know, a social being, a social animal, that actually what he was really enjoying in those last few years was working in an office environment with lots of interesting, young, creative, brilliant people on Starship Titanic, on H2G2.
[00:22:59.760 --> 00:23:05.560] That was what gave, that's what fed him, you know, the excitement.
[00:23:05.800 --> 00:23:08.200] Yeah, he was in the office a lot of the time.
[00:23:08.200 --> 00:23:12.200] He probably didn't need to be, but he really enjoyed spending time in our company, I think.
[00:23:12.520 --> 00:23:22.840] What's crucial in this story is to realize that unlike the leaders of a lot of other internet companies, Douglas Adams wasn't just a technologist.
[00:23:22.840 --> 00:23:27.880] He was both a technologist and a world-class creative.
[00:23:27.880 --> 00:23:32.680] And he was also someone deeply interested in human connection.
[00:23:32.680 --> 00:23:42.520] That is to say, what really interested him was how networked technology could be life-enhancing to humans and to our relationships.
[00:23:42.520 --> 00:23:49.960] Here he is in the year 2000, giving another keynote address, this time to a group of mobile phone executives.
[00:23:49.960 --> 00:23:54.760] He imagines a world of virtual reality and online dating.
[00:23:55.080 --> 00:24:03.480] Note that he uses the phrase soft world to denote what we would probably call the digital or virtual world today.
[00:24:04.440 --> 00:24:11.000] In the real world, if you want to know what the view from the top of the Eiffel Tower looks like, you have to go there.
[00:24:11.000 --> 00:24:17.240] In the soft world, you can ask your guide to show you what it looks like from the top of the Eiffel Tower.
[00:24:17.240 --> 00:24:21.080] Maybe it can show you the view through a webcam there instantly.
[00:24:21.080 --> 00:24:31.720] Or maybe it can construct a virtual reality view for you in real time based on everything it knows about everything that can currently be seen from the top of the Eiffel Tower.
[00:24:31.720 --> 00:24:35.240] So the first thing you would see is a lot of virtual graffiti.
[00:24:35.240 --> 00:24:39.160] Some of it quite naughty being French, and you just sort of wave it away.
[00:24:39.160 --> 00:24:47.200] And as you look down at your virtual view, you ask it how many of the cars you can see from your current vantage point, for instance, are British.
[00:24:44.840 --> 00:24:49.200] For a moment, you think it's not working.
[00:24:49.840 --> 00:24:53.920] Then you say, oh, okay, okay, so how many of them are Peugeots?
[00:24:53.920 --> 00:24:57.120] And the view lights up with thousands of moving dots.
[00:24:57.120 --> 00:25:02.960] Okay, how many of the cars you can currently see have some Bach playing on the in-car stereos?
[00:25:02.960 --> 00:25:04.400] A few dozen.
[00:25:04.400 --> 00:25:08.240] Oh, and there's one playing your favorite recording of the Schübler Preludes.
[00:25:08.240 --> 00:25:09.760] Do they have their flag up?
[00:25:09.760 --> 00:25:14.720] Yes, she'll talk to you, but only because the only thing you asked her about was what she was listening to.
[00:25:14.720 --> 00:25:17.280] Anything else and her flag would have been down for you.
[00:25:17.280 --> 00:25:21.120] You chat for a bit about the music and quickly discover a tremendous rapport.
[00:25:21.120 --> 00:25:22.720] What about having dinner together?
[00:25:22.720 --> 00:25:26.640] Okay, but she has a gluten problem, which restricts where she can eat.
[00:25:26.640 --> 00:25:28.240] And you like turbot.
[00:25:28.240 --> 00:25:31.040] So a couple of restaurants light up in your view.
[00:25:31.040 --> 00:25:35.440] One of them looks great for a romantic tryst, lots of alcoves and dim lighting.
[00:25:35.440 --> 00:25:47.600] But some of the people who have eaten there tonight have left notes around the place saying, left around in cyberspace, saying that they're obviously understaffed in the kitchen tonight and the food has been coming out cold or reheated.
[00:25:47.600 --> 00:25:51.120] The other place gets raves about the food, but it's a bit bright and noisy.
[00:25:51.120 --> 00:25:53.280] You decide that good food is the thing to go for.
[00:25:53.280 --> 00:25:59.040] Then you remember, damn, you're not actually in Paris, you're in New Delhi and got a bit carried away.
[00:26:00.320 --> 00:26:01.840] That's all right, says your new friend.
[00:26:01.840 --> 00:26:03.200] I'm actually in Albuquerque.
[00:26:03.200 --> 00:26:05.040] I'm a music squatter.
[00:26:05.360 --> 00:26:06.320] What does that mean?
[00:26:06.320 --> 00:26:10.640] Well, she just monitors the network for anybody who's looking for someone who's listening to that recording.
[00:26:10.640 --> 00:26:17.040] The real occupant of the car had his flag down and wasn't talking to anybody, so she just intercepted your query and really quite enjoyed talking to you.
[00:26:17.040 --> 00:26:18.800] And now she's going out to dinner locally.
[00:26:18.800 --> 00:26:19.120] Thanks.
[00:26:19.120 --> 00:26:19.840] Bye.
[00:26:20.800 --> 00:26:25.840] So you track which restaurant she's going to and send a margarita to her table to say thanks.
[00:26:25.840 --> 00:26:29.120] But she's annoyed that you tracked her down and turns the connection off.
[00:26:29.120 --> 00:26:33.000] Oh well, you go back to your day job erecting advertising hoardings on Mars.
[00:26:33.480 --> 00:26:37.720] Soft Mars, that is, which has recently been added to the soft solar system.
[00:26:39.320 --> 00:26:52.600] It is archetypal Douglas Adams to imagine that the highest calling of the internet would be to enable a failed romantic tryst between two lonely Bach enthusiasts in Paris.
[00:26:52.920 --> 00:27:06.040] But if he foresaw internet romance, Douglas also predicted catfishing and even more serious problems that a free-for-all information era would usher into our world.
[00:27:06.360 --> 00:27:13.640] Here is an extract from an article that Douglas wrote in 1999 about the issues of online trust.
[00:27:13.640 --> 00:27:16.680] Read once again by Samuel Barnett.
[00:27:17.320 --> 00:27:22.040] The internet is so new, we still don't really understand what it is.
[00:27:22.040 --> 00:27:27.320] We mistake it for a type of publishing or broadcasting because that's what we're used to.
[00:27:27.320 --> 00:27:36.680] So people complain that there's a lot of rubbish online or that it's dominated by Americans or that you can't necessarily trust what you read on the web.
[00:27:36.680 --> 00:27:41.240] Imagine trying to apply any of those criticisms to what you hear on the telephone.
[00:27:41.240 --> 00:27:50.440] Of course you can't trust what people tell you on the web any more than you can trust what people tell you on megaphones, postcards, or in restaurants.
[00:27:50.440 --> 00:28:00.440] Working out the social politics of who you can trust and why is quite literally what a very large part of our brain has evolved to do.
[00:28:00.760 --> 00:28:17.200] For some batty reason, we turn off this natural scepticism when we see things in any medium which require a lot of work or resources to work in, or in which we can't easily answer back, like newspapers, television, or granite, hence carved in stone.
[00:28:17.840 --> 00:28:23.040] What should concern us is not that we can't take what we read on the internet on trust.
[00:28:23.040 --> 00:28:24.320] Of course, you can't.
[00:28:24.320 --> 00:28:26.000] It's just people talking.
[00:28:26.000 --> 00:28:32.240] But that we ever got into the dangerous habit of believing what we read in the newspapers or saw on the TV.
[00:28:32.240 --> 00:28:37.280] A mistake that no one who has met an actual journalist would ever make.
[00:28:37.920 --> 00:28:44.640] One of the most important things you learn from the internet is that there is no them out there.
[00:28:44.640 --> 00:28:47.040] It's just an awful lot of us.
[00:28:48.000 --> 00:28:56.720] Those of us who remember the early days of social media in the heady 2000s remember it as a happy place.
[00:28:56.720 --> 00:29:07.280] Twitter was this endless tea party where celebrities were your friends and there was an ongoing, kind, intellectual, collaborative conversation.
[00:29:07.280 --> 00:29:14.960] As Douglas says, there was no them, just an awful lot of the best of us.
[00:29:14.960 --> 00:29:26.160] Early Twitter, perhaps, gave a glimpse of the sort of social media that Douglas Adams wanted to build, the guide's vision of the internet.
[00:29:26.160 --> 00:29:28.800] We didn't get that, of course.
[00:29:28.800 --> 00:29:35.760] Instead, we've ended up with an internet in which profit-reaping algorithms prioritize clickbait.
[00:29:35.760 --> 00:29:41.840] That is to say, they prioritize division, anger, and fear.
[00:29:42.160 --> 00:29:53.040] Instead of getting a place where minds can meet, we are being driven to our separate echo chambers and increasingly our separate social networks.
[00:29:53.040 --> 00:30:06.280] We are seeing increasing evidence that social media is a mental health threat, particularly for the young, and a profound threat to social cohesion on the national level.
[00:30:06.280 --> 00:30:09.000] None of this was the plan.
[00:30:09.000 --> 00:30:18.520] Where we have ended up is a sorry, dark shadow of what we had all hoped for at the dawn of the internet.
[00:30:18.520 --> 00:30:21.240] What would Douglas make of all this?
[00:30:21.240 --> 00:30:37.560] We can't know the answer, of course, but perhaps as a proxy, we could ask the people who worked alongside him in those early glory days, the people who strived alongside him, trying to build the kind of internet he wanted to see.
[00:30:37.560 --> 00:30:39.160] Here's Richard Harris.
[00:30:39.480 --> 00:30:48.280] Most social networks, they're not about supporting people, they're not about building community, they're about selling that community to paying advertisers.
[00:30:48.280 --> 00:30:59.240] And I think that's a fundamentally flawed model, as we're seeing in extremis with what's happened to Twitter now and to a large extent what's happening with Facebook.
[00:30:59.240 --> 00:31:05.560] Here's Wired co-founder Ian Charles Stewart with a slightly more nuanced view.
[00:31:05.880 --> 00:31:11.960] First of all, it's easy to beat social media up these days because everybody does it and everybody complains about it and everybody uses it as an excuse for everything.
[00:31:11.960 --> 00:31:20.120] It is nonetheless the way a whole bunch of new people have generated careers for themselves, have been able to get access to communities they would never have found otherwise.
[00:31:20.120 --> 00:31:24.200] The whole notion of long-tail communities is real because of social media.
[00:31:24.200 --> 00:31:31.720] If you cared about something but lived in rural New Zealand in the South Island and had no one else who cared about what you cared about on the internet, you can find them, and social media allows you to do that.
[00:31:31.720 --> 00:31:35.880] So we shouldn't forget the good bits as we complain about the bad bits.
[00:31:35.880 --> 00:31:53.760] But yes, I think we have at this current stage of development, and it's true for all technologies, we have people who are comfortable and feel like they're riding the wave, and then we have those people that feel like every time they try to put their head above water, they're crushed by the wave.
[00:31:53.760 --> 00:31:59.440] Those people tend to feel disenfranchised, they tend to feel angry, and therefore you end up with Twitter commentaries and trolls.
[00:31:59.440 --> 00:32:02.000] Same is true on your YouTube channel, same is true on all media.
[00:32:02.000 --> 00:32:13.280] And I think it's not easy to carry everybody along at the same pace when the world is being so rapidly changed in so many different ways.
[00:32:13.280 --> 00:32:14.960] Social media is just one part of that.
[00:32:14.960 --> 00:32:18.640] I think it's, if anything, it's a microcosm of what's happening in society more broadly.
[00:32:18.640 --> 00:32:22.160] I do think it points to governance challenges.
[00:32:22.160 --> 00:32:32.640] The question as to whether it's the nasty corporates or it's the misleading governments that are most to blame, I think is an open one.
[00:32:32.640 --> 00:32:41.600] The veneer of respectability that's assigned to governments because we theoretically vote for them, I don't think is all that it's cracked up to be.
[00:32:41.600 --> 00:32:52.080] We only have to look at voting in Venezuela, France, Austria, or heavens help us, the US, to question that veneer of respectability.
[00:32:52.080 --> 00:33:00.240] So I don't blame CEOs of big companies, nor blame necessarily heads of government.
[00:33:00.240 --> 00:33:01.360] I think it's a societal thing.
[00:33:01.360 --> 00:33:03.440] We just have to manage our way through.
[00:33:03.440 --> 00:33:09.120] And I think I don't mind, as a member of that society, where that help comes from or where that guidance comes from.
[00:33:09.120 --> 00:33:14.960] And I think sometimes the humor and wit of someone like Douglas is helpful in helping guide people through it.
[00:33:14.960 --> 00:33:24.240] James Goss, the novelist and my old school friend, in 1996 built the first website for a theatre show in the UK.
[00:33:24.240 --> 00:33:32.440] Five years later, James, for the BBC, produced the first webcast of a memorial service in internet history.
[00:33:32.760 --> 00:33:38.920] The memorial service in question was sadly and inevitably that of Douglas Adams.
[00:33:38.920 --> 00:33:44.440] My point is, James Goss, like Douglas, is an early adopter of technology.
[00:33:44.760 --> 00:34:01.640] I really think Douglas would have been amazing on social media, whilst also being the first person to point out that pouring all of your heart and soul and details of your entire life out there is a very stupid thing to do, and also none of it has any meaning.
[00:34:01.640 --> 00:34:04.040] You know, he would have been the first person to diagnose this.
[00:34:04.360 --> 00:34:06.200] And here's Stephen Fry.
[00:34:06.200 --> 00:34:11.880] Stephen, at one point in the early 2000s, had the biggest Twitter following in the world.
[00:34:11.880 --> 00:34:15.720] He has since withdrawn from all social media.
[00:34:16.040 --> 00:34:17.000] How do I put it?
[00:34:17.000 --> 00:34:19.400] I won't say he was fortunate in dying.
[00:34:19.400 --> 00:34:31.400] I mean, his mind when he died had really been unpolluted by what happened to the internet and by the invention of social media and then what happened to that.
[00:34:31.720 --> 00:34:36.200] And he would, of course, have been angry and disappointed and upset, as we all were.
[00:34:36.200 --> 00:34:41.080] And he predicted, yes, you could say he predicted the iPad.
[00:34:41.080 --> 00:34:47.320] You could say he predicted all kinds of ways that you could interface with technology.
[00:34:47.320 --> 00:35:03.480] But he didn't predict in a strange way the most important and terrible fact of technology is that it was still in the hands of these gibbering apes that are human beings and that they would despoil it and that it wouldn't improve them.
[00:35:03.480 --> 00:35:14.680] They would enshittify, to use Cory Doctorow's wonderful word, the enshittification of the internet, you know, and so he didn't live to see the enshittification.
[00:35:15.200 --> 00:35:29.760] When he died, it was still a place of tremendous optimism, and the barriers were being broken down, and frontiers, and disputes, and disagreements of longstanding were melting away, and all was going to be solved by the glory.
[00:35:29.760 --> 00:35:37.280] And what he didn't predict was that, far from making humans nicer, it seems to have made us nastier.
[00:35:37.280 --> 00:35:38.000] You know what I mean?
[00:35:38.000 --> 00:35:50.320] I mean, he started with a view of the lumpen bureaucracy in the shape of Vogons, for example, and the vagueness and the hopeless kind of unreliability of scientists and others.
[00:35:50.320 --> 00:36:02.240] And there's no real malevolence there of the kind that every day despoils our culture with lies and unpleasantness.
[00:36:02.560 --> 00:36:03.840] Quick footnote.
[00:36:03.840 --> 00:36:11.280] My editor has requested that I explain Cory Doctorow's concept of the enshittification of the Internet.
[00:36:11.280 --> 00:36:19.440] I think it's one of those words that actually explains itself rather well: enshittification.
[00:36:19.760 --> 00:36:24.480] The enshittification of the Internet.
[00:36:24.480 --> 00:36:26.000] Moving on.
[00:36:26.000 --> 00:36:50.080] Whilst Douglas didn't live to see it happen, I actually suspect that as an endless worrier, Douglas did fear that the technology he loved might, in the wrong hands and used with the wrong motivations, have terrible consequences and would likely, as Stephen Fry puts it, despoil our culture with lies and unpleasantness.
[00:36:50.400 --> 00:36:57.680] Exhibit one in this argument is Mostly Harmless, the fifth and final Hitchhiker's book.
[00:36:57.680 --> 00:37:15.480] It's a uniquely bleak book in his canon, the darkest by far of the series, ending with the unambiguous death of all of the much-loved characters: Arthur, Ford, Trillian, Zaphod, and the rest.
[00:37:15.800 --> 00:37:22.120] What is interesting, though, in this context is how and why they die.
[00:37:22.120 --> 00:37:30.760] The guide itself, as we've established, is a pretty good proxy for the internet, a user-generated source of information.
[00:37:30.760 --> 00:37:35.160] The guide is a cross between Wikipedia and a social media platform.
[00:37:35.160 --> 00:37:45.880] Even in the first Hitchhiker book, Douglas hints at some of the problems of a crowd-sourced repository of opinion masquerading as fact.
[00:37:46.840 --> 00:37:55.720] The Hitchhiker's Guide to the Galaxy is an indispensable companion to all those who are keen to make sense of life in an infinitely complex and confusing universe.
[00:37:55.720 --> 00:38:06.680] For though it cannot hope to be useful or informative on all matters, it does at least make the reassuring claim that where it is inaccurate, it is at least definitively inaccurate.
[00:38:06.680 --> 00:38:11.640] In cases of major discrepancy, it's always reality that's got it wrong.
[00:38:11.640 --> 00:38:16.440] This was the gist of the notice: it said, The guide is definitive.
[00:38:16.440 --> 00:38:19.560] Reality is frequently inaccurate.
[00:38:20.200 --> 00:38:24.360] And that's in the first book in 1979.
[00:38:24.360 --> 00:38:41.000] By the time we get to Mostly Harmless, in 1992, the guide has been taken over by a ruthless corporation, InfiniDim Enterprises, who plan to use it exclusively for profit and to eliminate all competition.
[00:38:41.000 --> 00:38:44.760] The real-life parallels here scarcely need pointing out.
[00:38:45.120 --> 00:38:55.600] InfiniDim uses sophisticated surveillance, data collection, and an autonomous AI to track and manipulate its users.
[00:38:55.600 --> 00:38:59.680] It defiantly asserts its authority over real life.
[00:38:59.680 --> 00:39:02.400] The guide is definitive.
[00:39:02.400 --> 00:39:05.760] Reality is definitely inaccurate.
[00:39:05.760 --> 00:39:23.680] For example, when the Earth stubbornly insists on existing, despite the guide entry on it categorically stating that it had been demolished, the Guide's sentient AI goes about an elaborate scheme to ensure that reality conforms to its description.
[00:39:23.680 --> 00:39:33.840] To do this, it has to destroy not only the Earth, but all Earths in all realities, once and forever.
[00:39:33.840 --> 00:39:50.240] In the final sentences of the book, Arthur Dent realizes what is about to happen: that he, his friends, his daughter, and everyone he has ever loved are about to be destroyed forever.
[00:39:50.880 --> 00:39:56.080] Things began slowly to reassemble themselves in Arthur's mind.
[00:39:56.400 --> 00:40:01.600] He wondered what he should do, but he only wondered it idly.
[00:40:01.920 --> 00:40:05.760] Around him, people were beginning to rush and shout a lot.
[00:40:06.080 --> 00:40:10.720] But it was suddenly very clear to him that there was nothing to be done.
[00:40:10.720 --> 00:40:12.880] Not now or ever.
[00:40:13.520 --> 00:40:22.480] Through the new strangeness of noise and light, he could just make out the shape of Ford Prefect sitting back and laughing wildly.
[00:40:23.440 --> 00:40:27.120] A tremendous feeling of peace came over him.
[00:40:27.760 --> 00:40:38.360] He knew that at last, for once and forever, it was now all finally over.
[00:40:39.640 --> 00:41:01.480] Hearing that today from the perspective of a society which has had its reality terribly distorted by the enshittification of social media, it's impossible not to see it as a very, very bleak satire on a world that had not yet been born when it was written.
[00:41:01.480 --> 00:41:12.520] Douglas always regretted that his final Hitchhiker book was so bleak, and he planned to go back to the series one day and write a happier ending.
[00:41:12.520 --> 00:41:23.240] Sadly, he never got round to doing that, in the same way that he didn't get to stick around and help guide the evolution of social media.
[00:41:23.240 --> 00:41:39.800] So let's give Douglas the last word in this chapter as he looks to the future, to the guide to the internet, the internet that we didn't get, but the internet that the inventor of The Hitchhiker's Guide to the Galaxy had hoped to build.
[00:41:40.120 --> 00:41:51.080] This is an idea I've been pursuing for a while, and since it grew out of the Hitchhiker's Guide to the Galaxy, I called it H2G2 and started it out as a community website.
[00:41:51.080 --> 00:41:59.160] A community that is of voluntary researchers starting to build the very guide that they would then be able to use.
[00:41:59.160 --> 00:42:01.400] A collaboratively built guide.
[00:42:01.400 --> 00:42:09.480] It's in its infancy, though it has already built up a hugely enthusiastic group of researchers pouring stuff into it.
[00:42:09.480 --> 00:42:18.560] Before we could even get a couple of steps along the way of building the kind of infrastructure we needed to make the thing start to self-organize and self-propagate, guess what?
[00:42:18.560 --> 00:42:23.440] Like every other website on the planet, we ran out of resources or money, as we call it.
[00:42:23.440 --> 00:42:35.680] I hope that one day we can begin to form the basis of something that brings my original vision of The Hitchhiker's Guide to the Galaxy as a mobile, personal, collaborative guide to real life.
[00:42:37.280 --> 00:42:43.680] Our AI Grok is modeled after The Hitchhiker's Guide to the Galaxy, which is one of my favorite books.
[00:42:43.680 --> 00:42:47.360] It's a book on philosophy, disguised as a book on humor.
[00:42:47.360 --> 00:43:02.800] Actually, that forms the basis of my philosophy, which is that we don't know the meaning of life, but the more we can expand the scope and scale of consciousness, digital and biological, the more we're able to understand what questions to ask about the answer
[00:43:02.800 --> 00:43:04.720] that is the universe.
[00:43:05.040 --> 00:43:08.080] So I have a philosophy of curiosity.
[00:43:08.080 --> 00:43:10.160] But I'm afraid I lied.
[00:43:10.160 --> 00:43:13.360] We can't give Douglas the last word.
[00:43:13.360 --> 00:43:24.160] Because it turns out that someone else has come along with their own vision of what an internet based on The Hitchhiker's Guide to the Galaxy would look like.
[00:43:24.480 --> 00:43:31.440] There is no question that Elon Musk is a major figure in this moment of global history.
[00:43:31.440 --> 00:43:37.040] He has done more to speed the transition to electric cars than anyone alive.
[00:43:37.040 --> 00:43:44.480] And through SpaceX, he has pushed the boundaries of space exploration further than ever before.
[00:43:44.480 --> 00:43:52.400] If we actually get to hitchhike the galaxy in our lifetimes, it will largely be because of Elon Musk.
[00:43:52.720 --> 00:44:12.840] And yet, his reinvention of Twitter as an organ of right-wing politics, his championing and partnership with Donald Trump, and his reverse takeover and defenestration of the American government all give one, to put it as neutrally as possible, pause.
[00:44:13.160 --> 00:44:21.480] Now, on the one hand, it is hardly surprising that a tech bro like Elon is a fan of Hitchhiker.
[00:44:21.480 --> 00:44:25.160] But his admiration goes beyond the casual.
[00:44:25.160 --> 00:44:32.920] Elon Musk sent a plaque reading, Don't Panic, into space strapped onto his Tesla Roadster.
[00:44:32.920 --> 00:44:44.280] As he's just told us, he based his AI engine on the guide, and he plans to name one of his Mars settlement ships the Heart of Gold.
[00:44:44.280 --> 00:44:58.440] Even more than any of that, though, Elon talks about Douglas the same way I talk about Douglas, as a profound thinker whose contributions go beyond the realms of comedy.
[00:44:58.440 --> 00:45:04.760] So I find myself puzzling about what exactly Elon sees in Douglas Adams.
[00:45:04.760 --> 00:45:07.560] Is Elon seeing the same thing I see?
[00:45:07.880 --> 00:45:15.080] Here are some other points of view from other Douglas friends and fans, starting with James Goss.
[00:45:15.080 --> 00:45:19.880] If Douglas was alive today, he would have taken great joy in blocking Elon on Twitter.
[00:45:19.880 --> 00:45:26.360] That would have been the thunderclap that just resounds around the world, and it would have happened like three years before the rest of us got there.
[00:45:26.680 --> 00:45:35.000] Emma Westecott is an associate professor of game design at OCAD, Canada's largest art and design university.
[00:45:35.000 --> 00:45:41.720] She's as thoughtful a person on gaming, social media, and interactivity as you will ever meet.
[00:45:41.720 --> 00:45:46.640] And she started her career working with Douglas at the Digital Village.
[00:45:47.200 --> 00:46:01.680] And yet, the sort of tech bros in Silicon Valley have taken it as a to-do list without actually having the reflexivity to sort of see that it may have been a critique or a warning.
[00:46:01.680 --> 00:46:03.760] So I think that's interesting.
[00:46:04.160 --> 00:46:15.360] I think that when Elon Musk sent Don't Panic up into space or whatever he did, I heard Douglas sort of shift in his grave sort of thing.
[00:46:15.360 --> 00:46:22.080] Max Landis, the screenwriter and comic book author, is as big a Douglas fan as anyone alive.
[00:46:22.080 --> 00:46:25.760] We collaborated on the Gently Netflix series.
[00:46:25.760 --> 00:46:32.720] Max is himself a big and sometimes controversial character with a pretty healthy ego.
[00:46:32.720 --> 00:46:39.120] So I wondered what he would make of Elon sharing our love for Douglas.
[00:46:39.760 --> 00:46:45.040] Elon, what a fascinating evolution of a public figure.
[00:46:45.360 --> 00:46:51.040] Like what a transformation, an unself-conscious transformation into a villain.
[00:46:51.680 --> 00:47:12.720] And I think certain people are cushioned both by their neurological makeup and their parenting and the $400 billion behind them from having to look at the world in a realistic way.
[00:47:12.720 --> 00:47:19.600] And I think Douglas appeals to people who want to think they know a little better because Douglas writes like he knows a little better.
[00:47:19.600 --> 00:47:23.360] He writes like he kind of gets it a little more than you do.
[00:47:23.360 --> 00:47:36.760] However, a lot of times, what people miss, and when you talk about Musk, what I think someone like Musk misses is that ultimately, Hitchhiker's Guide, one anyway, is fucking about Arthur.
[00:47:37.080 --> 00:47:40.840] You're really locked in in his experience in that book.
[00:47:40.840 --> 00:47:46.120] It's about a normal guy who is a victim of systems and institutions.
[00:47:46.120 --> 00:47:55.720] And if you are someone like Elon Musk, who exists at the highest spectrum of human power, he does, he can have anything he wants.
[00:47:55.720 --> 00:47:58.760] I think you want to believe you deserve that position.
[00:47:58.760 --> 00:48:00.760] You have to believe you know better.
[00:48:00.760 --> 00:48:02.680] So I think, does that make sense?
[00:48:02.680 --> 00:48:07.880] It's almost like, it's almost like in his head, he turns Douglas Adams into Ayn Rand.
[00:48:08.520 --> 00:48:14.440] The problem, I mean, the problem with Elon, if we're honest, is he's a Zaphod, but he thinks he's a Ford.
[00:48:14.440 --> 00:48:15.560] Yeah, it's real.
[00:48:16.520 --> 00:48:18.760] You know, when you meet a Ford.
[00:48:18.760 --> 00:48:22.520] So, like, it, and it's not Elon, Elon Zaphod.
[00:48:22.520 --> 00:48:23.480] He would have another head.
[00:48:23.480 --> 00:48:25.640] His kid is named like X23.
[00:48:25.640 --> 00:48:28.520] Like, he would do whatever it took.
[00:48:28.520 --> 00:48:40.600] And I believe if you showed Trump the machine that shows you your size in the universe, the you are here machine, he would have exactly the same reaction as Zaphod.
[00:48:40.600 --> 00:48:43.320] Ah, it zoomed in on me.
[00:48:43.640 --> 00:48:56.760] The image of Elon Musk and Donald Trump sitting in the total perspective vortex is perhaps the best way to give Douglas the last word in this chapter.
[00:49:04.680 --> 00:49:11.160] That was a preview of the new audiobook, Douglas Adams: The Ends of the Earth, from Pushkin Industries.
[00:49:11.160 --> 00:49:21.840] Get Douglas Adams The Ends of the Earth now at Audible, Spotify, pushkin.fm/audiobooks, or wherever audiobooks are sold.
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
  "segments": [
    {
      "segment_title": "Topic Discussion",
      "timestamp": "01:15:30",
      "key_takeaway": "main point from this segment",
      "segment_summary": "brief description of what was discussed"
    }
  ]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Prompt 4: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
  "media_mentions": [
    {
      "title": "Exact Title as Mentioned",
      "category": "Book",
      "author_artist": "N/A",
      "context": "Brief context of why it was mentioned",
      "context_phrase": "The exact sentence or phrase where it was mentioned",
      "timestamp": "estimated time like 01:15:30"
    }
  ]
}
If no media is mentioned, return: {"media_mentions": []}
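The extraction prompts above all require strictly valid JSON with a constrained shape. As a rough, illustrative sketch only (this is not part of the original processing pipeline, and the function and constant names here are invented), a response could be checked in Python against the constraints the prompts state, such as parseable JSON, the four allowed media categories, and HH:MM:SS timestamps:

import json
import re

# Constraints taken from the prompt text above; the names are illustrative only.
ALLOWED_CATEGORIES = {"Book", "Movie", "TV Show", "Music"}
TIMESTAMP_PATTERN = re.compile(r"^\d{2}:\d{2}:\d{2}$")  # HH:MM:SS

def check_media_mentions(raw_response):
    """Return a list of problems found in a media_mentions response string."""
    problems = []
    try:
        # json.loads already rejects trailing commas and unbalanced braces.
        data = json.loads(raw_response)
    except json.JSONDecodeError as err:
        return ["not valid JSON: {}".format(err)]
    if not isinstance(data, dict):
        return ["top-level value is not a JSON object"]
    for item in data.get("media_mentions", []):
        if item.get("category") not in ALLOWED_CATEGORIES:
            problems.append("invalid category: {!r}".format(item.get("category")))
        if not TIMESTAMP_PATTERN.match(item.get("timestamp", "")):
            problems.append("timestamp not HH:MM:SS: {!r}".format(item.get("timestamp")))
    return problems

# The prompt's own empty-result example passes cleanly:
print(check_media_mentions('{"media_mentions": []}'))  # prints []

The same kind of check could be applied to the key-takeaways and segments responses, since they share the valid-JSON and timestamp requirements.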
Full Transcript
[00:00:03.120 --> 00:00:04.320] Hey, it's Brian.
[00:00:04.320 --> 00:00:10.720] Today I'm doing things a little differently and bringing you a preview of a new audiobook I think you'll really like.
[00:00:10.720 --> 00:00:23.840] It's called Douglas Adams The Ends of the Earth, and it celebrates the wit and wisdom of the legendary science fiction author of Hitchhiker's Guide to the Galaxy and Dirk Gently's Holistic Detective Agency.
[00:00:23.840 --> 00:00:35.040] Douglas Adams was someone who thought deeply about the biggest problems in the world, from the internet to artificial intelligence to space exploration, politics, and conservation.
[00:00:35.040 --> 00:00:42.800] He was a sharp critic and a profoundly disruptive thinker of the way we do things, just like us here at Skeptoid.
[00:00:42.800 --> 00:00:53.440] We've often covered the many ways technology has been shaped and shifted over time, so we think you'll enjoy hearing Adams' thoughts on tech in this preview.
[00:00:53.440 --> 00:01:01.280] It's a journey into the mind of a man who foresaw the technological age in all its wonder and terror.
[00:01:01.600 --> 00:01:03.120] So here is the preview.
[00:01:03.120 --> 00:01:05.520] We hope you enjoy it as much as we did.
[00:01:05.520 --> 00:01:17.760] If you do, you can get Douglas Adams' The Ends of the Earth now at Audible, Spotify, Pushkin.fm/slash audiobooks, or wherever audiobooks are sold.
[00:01:21.600 --> 00:01:22.800] Chapter 3.
[00:01:22.800 --> 00:01:24.560] Everything is Connected.
[00:01:24.560 --> 00:01:26.880] The Internet We Didn't Get.
[00:01:27.200 --> 00:01:35.840] The computer scientist Danny Hillis came up with a brilliant definition of technology, which is technology is stuff that doesn't work yet.
[00:01:37.440 --> 00:01:43.600] Now, the interesting thing is everything we've invented was at some point technology.
[00:01:43.600 --> 00:01:46.640] I mean, a chair was technology before we'd figured it out.
[00:01:46.640 --> 00:01:51.760] Now we know what a chair is, but before we did, you know, people were trying to figure out, well, how many legs should it have?
[00:01:51.760 --> 00:01:52.960] How high should it be?
[00:01:53.200 --> 00:01:55.760] And until we got it, they used to crash.
[00:01:56.080 --> 00:02:02.520] Now, of course, we know everything you need to know about a chair, so we no longer think of chairs as technology, we just use them.
[00:02:02.520 --> 00:02:10.200] Now, it's very odd that we still tend to think of bathroom taps as technology, and we keep on reinventing them and reinventing them.
[00:02:10.200 --> 00:02:16.760] Nowadays, particularly in the hotel, you can hardly walk into a washroom without you've got to figure it out all over again.
[00:02:16.760 --> 00:02:17.640] Now, how do I open?
[00:02:18.280 --> 00:02:20.120] Am I responsible for turning it on?
[00:02:20.120 --> 00:02:21.400] Do I do it with my hands?
[00:02:21.400 --> 00:02:22.680] Do I just look at it?
[00:02:22.680 --> 00:02:24.920] Do I use my elbows?
[00:02:24.920 --> 00:02:27.560] And then you think, now who's responsible for turning it off?
[00:02:27.560 --> 00:02:28.920] Do I just walk away?
[00:02:28.920 --> 00:02:37.880] Or you know, we can't leave the damn things alone because we're so relentlessly, ceaselessly inventive.
[00:02:37.880 --> 00:02:41.560] So we keep on inventing stuff maybe that doesn't need to be reinvented.
[00:02:41.720 --> 00:02:45.800] I would think it's probably time to call a moratorium on bathroom taps.
[00:02:45.800 --> 00:02:51.960] Chairs and bathroom taps aside, Douglas was obsessed with all types of technology.
[00:02:51.960 --> 00:02:57.560] But the specific type of technology that obsessed Douglas above all else was computing.
[00:02:57.560 --> 00:03:03.160] And specifically, what happens when you start to connect computers together on a network?
[00:03:03.160 --> 00:03:09.720] Now, many years ago, I invented a thing called the Hitchhiker's Guide to the Galaxy.
[00:03:10.680 --> 00:03:15.480] I never meant to be a predictive science fiction writer in the mode of, say, Arthur C.
[00:03:15.480 --> 00:03:16.520] Clarke.
[00:03:16.520 --> 00:03:24.360] My reason for inventing it was purely one of narrative necessity, which is that I had lots of extra bits of story that I didn't know what to do with.
[00:03:25.000 --> 00:03:30.840] I didn't know anything at all about technology in those days, and I didn't think twice about any of the issues.
[00:03:30.840 --> 00:03:34.280] So I just made it a bit like things I was already familiar with.
[00:03:34.280 --> 00:03:37.880] It was a bit like a pocket calculator, a bit like a TV remote.
[00:03:37.880 --> 00:03:41.880] It had a little window at the top and lots and lots of little buttons on the front.
[00:03:41.880 --> 00:03:53.920] The Hitchhiker's Guide to the Galaxy, as it manifests itself in the five novels that bear its name, is an anarchic, attitude-filled, user-generated publication.
[00:03:53.920 --> 00:04:01.600] Today, we would call it an app: Wikipedia on acid, with aggressive social media functionality.
[00:04:01.600 --> 00:04:14.160] But when Douglas Adams first described it in 1979, neither the iPad, nor the internet, nor Wikipedia, nor social media had been invented.
[00:04:14.480 --> 00:04:25.680] One of the strangest things about the blurring between fiction and real life in Douglas's career was not that he predicted technologies that later came to pass.
[00:04:25.680 --> 00:04:30.240] Many science fiction writers, as Douglas pointed out himself, from Arthur C.
[00:04:30.240 --> 00:04:33.040] Clarke to William Gibson, have done that.
[00:04:33.360 --> 00:04:48.160] Where Douglas is unusual is that 20 years after writing a book about the guide, he founded a production company, the Digital Village, with a team that included top technologists and internet pioneers.
[00:04:48.160 --> 00:04:55.840] And then he set about trying to actually build a digital guide to the world in real life.
[00:04:56.160 --> 00:04:59.120] But let us not get ahead of ourselves.
[00:04:59.120 --> 00:05:14.720] To understand how deep and old Douglas's love for technology in general and network computing in particular ran, and the strange role he ended up playing in the history of the internet, we need to start at the very beginning.
[00:05:14.720 --> 00:05:18.880] And who was at the very beginning of Douglas's love affair with computers?
[00:05:18.880 --> 00:05:21.440] Stephen Fry, of course.
[00:05:22.720 --> 00:05:25.840] We started talking about computers.
[00:05:25.840 --> 00:05:30.520] And you have to remember that computers then meant isolated objects on a desk.
[00:05:29.600 --> 00:05:33.160] There was no networking, there was no internet.
[00:05:33.480 --> 00:05:42.760] There was, I mean, there were sort of vague versions of it within universities, but there was no way you could get onto the internet from home.
[00:05:42.760 --> 00:05:48.920] And we both, I think, understood that that was the future of computing.
[00:05:48.920 --> 00:06:02.840] And we both were aware of something that became very important in our lives: that Apple computers, who were big in the 70s, they really kick-started the whole home computing craze.
[00:06:02.840 --> 00:06:06.360] They were bringing out a new and rather unusual sort of computer.
[00:06:06.360 --> 00:06:11.400] We met then, I guess, in 83, maybe it was mid-83, something like that.
[00:06:11.720 --> 00:06:19.080] And we were aware that in January 84, Apple were bringing out a new computer, which they were going to call the Macintosh.
[00:06:19.080 --> 00:06:24.280] And I was able to tell him, which he didn't know, that it was called the Macintosh.
[00:06:24.440 --> 00:06:30.920] Nothing to do with raincoats, but because Macintosh is a kind of apple.
[00:06:30.920 --> 00:06:36.040] It's like a Golden Delicious or a Granny Smith or a Cox's Pippin.
[00:06:36.040 --> 00:06:39.240] But there is the Macintosh apple, very popular in America.
[00:06:39.240 --> 00:06:40.120] Hence, you know.
[00:06:40.120 --> 00:06:41.640] So it's an apple.
[00:06:41.640 --> 00:06:47.880] Anyway, I could get bogged down in all these ridiculous memories that would mean nothing to anybody else.
[00:06:47.880 --> 00:06:52.600] But we were, and this is the story I like to tell, and he told it too.
[00:06:52.600 --> 00:06:59.560] So we were the first two people in Britain, possibly Europe, to buy an Apple Macintosh.
[00:06:59.880 --> 00:07:09.080] It is worth mentioning in passing that for some years, Stephen and Douglas would argue about who was actually first in line at the Mac store that day.
[00:07:09.080 --> 00:07:13.800] Whoever it was, Douglas's love for computers predated the Macintosh.
[00:07:13.800 --> 00:07:17.120] Here he is, describing love at first sight.
[00:07:14.520 --> 00:07:20.800] But I remember the first time I ever saw a computer.
[00:07:21.680 --> 00:07:27.200] It was in Laskys in the Tottenham Court Road, a fond memory.
[00:07:27.840 --> 00:07:29.760] And it was a Commodore PET.
[00:07:30.320 --> 00:07:37.120] I don't know if you remember the very, very early one, which is like a sort of pyramid with a little sort of screen, yay big.
[00:07:37.760 --> 00:07:42.000] And I sat and stared at it.
[00:07:42.320 --> 00:07:46.080] And I was absolutely fascinated by it.
[00:07:46.080 --> 00:07:56.080] And I was trying to figure out any reason, any possible reason why this could have any use to me.
[00:07:56.400 --> 00:08:05.200] And I just couldn't see any, and I was thinking as hard as I could, but I couldn't see any way in which a computer could be useful to me because I was a writer.
[00:08:05.200 --> 00:08:08.640] And I just didn't have that much stuff to add up.
[00:08:12.160 --> 00:08:18.800] And that was because I'd got the wrong idea of what a computer was.
[00:08:18.800 --> 00:08:22.400] I thought it was a kind of super adding machine.
[00:08:22.400 --> 00:08:26.880] But everybody thought it was a kind of super adding machine.
[00:08:26.880 --> 00:08:32.320] That was the model we had in our mind of what a computer was, a super adding machine.
[00:08:32.320 --> 00:08:37.440] And so we developed it as an adding machine with a long feature list.
[00:08:37.760 --> 00:08:44.320] And after a while, we begin to think, well, we're now getting very good at adding up and manipulating these numbers.
[00:08:44.320 --> 00:08:45.920] Is there anything else we can do with it?
[00:08:45.920 --> 00:08:49.440] I mean, supposing we made these numbers stand for something else.
[00:08:49.440 --> 00:08:54.720] Like we made it stand for the letters of the alphabet, for ASCII code.
[00:08:54.720 --> 00:09:01.800] And we suddenly think, my God, we have had such limited imagination.
[00:09:00.000 --> 00:09:07.000] We have failed to see the really exciting possibilities of the machine.
[00:09:07.160 --> 00:09:08.840] How short-sighted we've been.
[00:09:08.840 --> 00:09:12.520] It isn't an adding machine, it's a typewriter.
[00:09:12.840 --> 00:09:17.160] So we developed it as a typewriter with a long feature list.
[00:09:17.160 --> 00:09:22.040] And then after a while, we get on a roll and we think, well, what else can we make these numbers stand for?
[00:09:22.040 --> 00:09:26.200] We can make them stand for the elements of a graphic display.
[00:09:27.800 --> 00:09:30.680] And we come up with yet another paradigm shift.
[00:09:30.680 --> 00:09:35.240] We think, my God, how short-sighted of us to think that it was just an adding machine or a typewriter.
[00:09:35.240 --> 00:09:37.400] It's something much more exciting than any of these.
[00:09:37.400 --> 00:09:42.040] It's a television with a typewriter sat in front of it.
[00:09:43.000 --> 00:09:49.720] And then with the invention of the World Wide Web, we go to yet another paradigm shift and we think, my God, it isn't any of these.
[00:09:49.720 --> 00:09:50.680] It isn't any of these.
[00:09:50.680 --> 00:09:51.880] It isn't just a typewriter.
[00:09:51.880 --> 00:09:52.920] It isn't just an adding machine.
[00:09:52.920 --> 00:09:54.360] It isn't a television.
[00:09:54.360 --> 00:09:56.600] It's a brochure.
[00:09:57.240 --> 00:10:06.520] In less than three minutes, almost as a throwaway gag, Douglas sketches out a 20-year intellectual history of personal computing.
[00:10:06.520 --> 00:10:26.440] From adding machine to typewriter to television to brochure, Douglas skewers how at every stage we have had a dramatically limited understanding of how computers were truly transformational, particularly when combined with the power of networked connectivity.
[00:10:27.080 --> 00:10:37.640] The specificity and power of Douglas's vision of what the internet was and what it could be were breathtakingly ahead of his time.
[00:10:37.640 --> 00:10:45.000] Here he is in 2001, speaking to a collection of mobile phone and network technologists and executives.
[00:10:46.480 --> 00:11:29.840] Imagine if every piece of information we ever generated about the world that passed through a computer, whether it's a restaurant typing up its menu for the evening, whether it's a shop maintaining its stock list, whether it's a car noticing what speed it's going, how much petrol it's got left, and where the nearest service stations are, what prices they're charging for petrol are, or whether it's someone measuring the wingspan of an African swallow, or writing down where and when their grandmother was born, or whether it's someone taking a digital photograph from the top of the Great Pyramid, or just a flower that's bloomed late or early this year, or the settings in your thermostat and when it turned on and off.
[00:11:29.840 --> 00:11:34.880] Or if every time you took your child's temperature, the network remembered.
[00:11:35.200 --> 00:11:43.440] Imagine all of that gradually creating a vast shared software model of the world.
[00:11:43.760 --> 00:11:45.520] Just imagine it.
[00:11:47.760 --> 00:11:58.720] The one thing I did get right when I came up with the Hitchhiker's Guide to the Galaxy was that it would know where you were and come up with information appropriately.
[00:11:59.040 --> 00:12:04.960] And that, of course, is the thing that makes the crucial difference to everything in that list above.
[00:12:04.960 --> 00:12:18.320] If every piece of information knew the when of itself and the where of itself, so that the virtual world we created fitted over the real world like an invisible glove.
[00:12:18.640 --> 00:12:32.120] And these devices, these devices that we currently think of as telephones or PDAs, would be the devices that made that invisible world visible to us.
[00:12:32.680 --> 00:12:47.560] Today, in 2025, when we all walk about with geolocated supercomputers in our pockets, what Douglas is describing seems simply an accurate description of the world in which we now live.
[00:12:47.560 --> 00:12:56.600] Back in 2001, it was insane science fiction, except for the companies who were trying to build it.
[00:12:56.600 --> 00:13:12.280] And one of those companies was Douglas's own Digital Village, where he had assembled an enormously talented group of people to try and make, as he described it, the Earth edition of The Hitchhiker's Guide to the Galaxy.
[00:13:12.280 --> 00:13:14.360] Here's Robbie Stamp.
[00:13:14.680 --> 00:13:29.080] I'd written unbidden a business plan called Cable City, which was a recognition that to take advantage of new cable and satellite opportunities, you needed to create an entirely new cost base for programming.
[00:13:29.080 --> 00:13:35.160] And I was telling Douglas about it, and it had morphed by then into something called the Digital Village.
[00:13:35.160 --> 00:13:39.240] The Marshall McLuhan quote influenced that, the digital village.
[00:13:39.240 --> 00:13:45.400] And I was sitting there with Douglas, you know, several of our meetings in his house, and he said, oh, that sounds interesting.
[00:13:45.400 --> 00:13:46.840] I'd like to be your partner.
[00:13:46.840 --> 00:13:49.000] How much would it cost me to invest?
[00:13:49.000 --> 00:13:52.600] So I plucked a figure out of the air and he said, I'm in.
[00:13:52.600 --> 00:13:53.320] And that was it.
[00:13:53.320 --> 00:14:00.920] We were then 50-50 partners in this wonderful, one of the best adventures of my working life called the Digital Village.
[00:14:01.240 --> 00:14:09.080] So Robbie and Douglas launched this company, The Digital Village, and they recruit almost at once an amazing group of people.
[00:14:09.080 --> 00:14:11.080] And they take the moonshot.
[00:14:11.080 --> 00:14:14.200] They're going to build the Hitchhiker's Guide.
[00:14:14.200 --> 00:14:19.360] Here's Richard Harris, who served as Chief Technology Officer of the Digital Village.
[00:14:19.920 --> 00:14:23.920] Remember, this is the days before the term social network had been actually coined.
[00:14:23.920 --> 00:14:33.280] You know, you had Tripod and MySpace and all of the CompuServe forums at the time, but very much precursors of social networks.
[00:14:33.280 --> 00:14:42.880] And of course, the Hitchhiker's Guide itself was meant to be an opinionated counterpoint to the Encyclopedia Galactica, the old Asimov construct.
[00:14:42.880 --> 00:14:45.760] But we didn't have an Encyclopedia Galactica at the time.
[00:14:45.760 --> 00:14:50.800] If we were doing it now, we would be building an opinionated counterpoint to Wikipedia.
[00:14:50.800 --> 00:14:54.000] And I would love the challenge of doing that.
[00:14:54.000 --> 00:14:54.720] But we didn't.
[00:14:54.720 --> 00:15:06.640] So we had to come up with a content-driven network of people that created its own content that was self-policing and created trust between people.
[00:15:06.640 --> 00:15:18.880] And what we were all about was delivering people to each other via content and helping them collaboratively build content, which is actually something I think we succeeded on very well with H2G2.
[00:15:19.200 --> 00:15:26.480] Now, there were a lot of people cashing in on the first dot-com boom of the late 90s.
[00:15:26.480 --> 00:15:32.720] Lots of people trying to build social networks and internet sites.
[00:15:32.720 --> 00:15:38.880] But note the specificity of Douglas's vision as described by Harris.
[00:15:39.520 --> 00:15:45.840] We were never in the business of doing a Facebook and simply delivering eyeballs to advertisers.
[00:15:46.160 --> 00:15:52.240] That's the fundamental contradiction in most social networks: they're not about supporting people.
[00:15:52.240 --> 00:15:53.920] They're not about building community.
[00:15:53.920 --> 00:15:57.680] They're about selling that community to paying advertisers.
[00:15:57.680 --> 00:16:00.600] And I think that's a fundamentally flawed model.
[00:15:59.840 --> 00:16:05.080] You need systems that can see what's happening and adapt to them in real time.
[00:16:05.720 --> 00:16:13.320] And you need to give people not just what they knew to ask for, but what they needed to know but didn't know to ask for.
[00:16:13.320 --> 00:16:24.280] And that was, if you like, a fundamental strapline for our development of the real hitchhiker's guide: if you're giving people what they ask for, you're just a search engine.
[00:16:24.280 --> 00:16:30.680] You may be a sophisticated search engine, you may be a constrained, community-driven search engine, but you're still a search engine.
[00:16:30.680 --> 00:16:39.320] But you need to be building an understanding of people sufficient to let them find the stuff that they didn't know to ask for.
[00:16:39.320 --> 00:16:51.080] Part of our challenge was actually: okay, how do we build a knowledge framework that profiles people for their own benefit, not for advertisers' benefit, and helps put them in touch with who and what they needed to know.
[00:16:51.080 --> 00:16:58.120] So it's people to people, people to content, people and content to place, putting all of those together.
[00:16:58.120 --> 00:17:05.160] Now, you might suspect that there is an element of hindsight at work here in Richard's description.
[00:17:05.160 --> 00:17:15.320] But I have the advantage of actually having been in the room when h2g2.com was being planned and built.
[00:17:15.320 --> 00:17:30.360] And I can attest that this thoughtfulness about the dangers of what social media could become and the importance of editors, fact-checkers, and community self-regulation were built into its early design.
[00:17:30.360 --> 00:17:40.200] Those insights came not only from the official technologists and scientists at the company, but often intuitively from Douglas himself.
[00:17:40.520 --> 00:17:42.280] Here's Robbie Stamp again.
[00:17:42.600 --> 00:17:44.520] Everything is connected.
[00:17:44.800 --> 00:17:53.200] I think that deep sense at his core, I think so much of what he explores creatively through almost all of his work is that idea.
[00:17:53.200 --> 00:17:56.960] Everything is connected and everything is perspectival.
[00:17:57.280 --> 00:18:11.280] And I think he saw in the internet an emergent technology which was going to sort of in a way make, if I can use this, make flesh that interconnectedness of things.
[00:18:11.920 --> 00:18:20.960] There were millions and millions and millions of nodes which were going to be connecting in new ways, some of which we could maybe predict and many we couldn't.
[00:18:20.960 --> 00:18:41.920] And I think he found that emergent property intellectually huge and exciting, because I think it was an enormous river flowing into his creativity. Maybe one of the most important wellsprings, in my mind, was that natural fascination with perspective and with connectivity.
[00:18:41.920 --> 00:18:55.440] And I think that the internet was there, he just saw it very, very, very early on as this massive playground where so much was going to be possible that hadn't been possible going through traditional gatekeepers.
[00:18:56.160 --> 00:19:06.480] But I think Douglas, when I look back to those early conversations, was phenomenally prescient about an awful lot of the ways in which the internet duly has evolved.
[00:19:06.480 --> 00:19:21.680] I think maybe some of the wilder, nastier aspects or the more difficult aspects of what we've learned about social media and the internet, he kind of died before we'd really seen a lot of that.
[00:19:21.680 --> 00:19:34.040] And I think before, you know, the internet found its business model, which was turning human beings into financial derivative products who were sliced, diced, and sold, and resold, in these massive advertising trading markets.
[00:19:34.600 --> 00:19:41.080] So I think he was there in the early optimistic, more wide-eyed days of what was going to be possible.
[00:19:41.080 --> 00:19:47.880] But I think it was that freedom and that recognition of the interconnectedness of everything that thrilled him.
[00:19:48.200 --> 00:20:01.080] This idea, the idea that everything is connected, is one that Douglas rode to logical but absurd extremes in his Dirk Gently's Holistic Detective Agency books.
[00:20:01.720 --> 00:20:07.880] The term holistic refers to my conviction that what we are concerned with here is the fundamental interconnectedness of all things.
[00:20:08.040 --> 00:20:14.120] I do not concern myself with such petty things as fingerprint powder, telltale pieces of pocket fluff, and inane footprints.
[00:20:14.120 --> 00:20:18.280] I see the solution to each problem as being detectable in the pattern and web of the whole.
[00:20:18.280 --> 00:20:26.280] The connections between causes and effects are often much more subtle and complex than we, with our rough and ready understanding of the physical world, might naturally suppose.
[00:20:27.240 --> 00:20:32.920] Dirk Gently is a detective with a sort of superpower.
[00:20:32.920 --> 00:20:40.280] He has a semi-psychic ability to sense the deep interconnections between all things.
[00:20:40.600 --> 00:20:48.360] He intuits which people, which events, which locations are relevant to his current case.
[00:20:48.360 --> 00:20:56.040] And his methods, whilst unconventional, are normally successful or semi-successful.
[00:20:56.040 --> 00:21:00.440] He solves cases with questionable efficiency.
[00:21:00.440 --> 00:21:13.800] The great irony of Dirk Gently as a character is that this hero with a gift for sensing connections is himself profoundly bad at making human connections.
[00:21:13.800 --> 00:21:18.720] And he spends most of his life alone and lonely.
[00:21:19.040 --> 00:21:33.360] Without wanting to post-mortem psychoanalyze Douglas Adams, it is perhaps worth asking why he was so excited to become part of the Digital Village and to run a production company.
[00:21:33.360 --> 00:21:41.360] Here's Ian Charles Stewart, co-founder of Wired magazine and another of Douglas's partners at the Digital Village.
[00:21:41.360 --> 00:21:44.640] But the main reason we got together wasn't because there was a single driven purpose.
[00:21:44.640 --> 00:21:46.720] It was because he had writer's block.
[00:21:46.720 --> 00:21:56.880] And the challenge was how to create a team around him that allowed creativity to re-emerge for him to be able to continue telling stories in a different way.
[00:21:56.880 --> 00:22:04.560] And so the idea was to create a platform which he could use to tell stories in different media, in new media.
[00:22:04.560 --> 00:22:09.600] The first, of course, was the computer game, Starship Titanic, and we created a team around that.
[00:22:09.600 --> 00:22:11.840] And then the second was thinking about the film.
[00:22:11.840 --> 00:22:25.120] And the notion was that if we gave him, if we took him away from the typewriter and paper and put him in front of new tools with other people, that that might become something which he could get his teeth into and have fun with.
[00:22:25.120 --> 00:22:27.040] And it turned out to be the case.
[00:22:28.000 --> 00:22:31.840] Sophie Aston was Douglas' assistant at the Digital Village.
[00:22:31.840 --> 00:22:38.480] Sophie and I first met when we were both in our early 20s and excited to be around the great Douglas Adams.
[00:22:38.480 --> 00:22:45.120] Here she is recollecting what it was like to have Douglas in an office.
[00:22:45.120 --> 00:23:02.040] He was such a, you know, a social being, a social animal, that actually what he was really enjoying in those last few years was working in an office environment with lots of interesting, young, creative, brilliant people on Starship Titanic, on H2G2.
[00:22:59.760 --> 00:23:05.560] That was what gave, that's what fed him, you know, the excitement.
[00:23:05.800 --> 00:23:08.200] Yeah, he was in the office a lot of the time.
[00:23:08.200 --> 00:23:12.200] He probably didn't need to be, but he really enjoyed spending time in our company, I think.
[00:23:12.520 --> 00:23:22.840] What's crucial in this story is to realize that unlike the leaders of a lot of other internet companies, Douglas Adams wasn't just a technologist.
[00:23:22.840 --> 00:23:27.880] He was both a technologist and a world-class creative.
[00:23:27.880 --> 00:23:32.680] And he was also someone deeply interested in human connection.
[00:23:32.680 --> 00:23:42.520] That is to say, what really interested him was how networked technology could be life-enhancing to humans and to our relationships.
[00:23:42.520 --> 00:23:49.960] Here he is in the year 2000, giving another keynote address, this time to a group of mobile phone executives.
[00:23:49.960 --> 00:23:54.760] He imagines a world of virtual reality and online dating.
[00:23:55.080 --> 00:24:03.480] Note that he uses the phrase soft world to denote what we would probably call the digital or virtual world today.
[00:24:04.440 --> 00:24:11.000] In the real world, if you want to know what the view from the top of the Eiffel Tower looks like, you have to go there.
[00:24:11.000 --> 00:24:17.240] In the soft world, you can ask your guide to show you what it looks like from the top of the Eiffel Tower.
[00:24:17.240 --> 00:24:21.080] Maybe it can show you the view through a webcam there instantly.
[00:24:21.080 --> 00:24:31.720] Or maybe it can construct a virtual reality view for you in real time based on everything it knows about everything that can currently be seen from the top of the Eiffel Tower.
[00:24:31.720 --> 00:24:35.240] So the first thing you would see is a lot of virtual graffiti.
[00:24:35.240 --> 00:24:39.160] Some of it quite naughty being French, and you just sort of wave it away.
[00:24:39.160 --> 00:24:47.200] And as you look down at your virtual view, you ask it how many of the cars you can see from your current vantage point, for instance, are British.
[00:24:44.840 --> 00:24:49.200] For a moment, you think it's not working.
[00:24:49.840 --> 00:24:53.920] Then you say, oh, okay, okay, so how many of them are Peugeots?
[00:24:53.920 --> 00:24:57.120] And the view lights up with thousands of moving dots.
[00:24:57.120 --> 00:25:02.960] Okay, how many of the cars you can currently see have some Bach playing on the in-car stereos?
[00:25:02.960 --> 00:25:04.400] A few dozen.
[00:25:04.400 --> 00:25:08.240] Oh, and there's one playing your favorite recording of the Schubler Preludes.
[00:25:08.240 --> 00:25:09.760] Do they have their flag up?
[00:25:09.760 --> 00:25:14.720] Yes, she'll talk to you, but only because the only thing you asked her about was what she was listening to.
[00:25:14.720 --> 00:25:17.280] Anything else and her flag would have been down for you.
[00:25:17.280 --> 00:25:21.120] You chat for a bit about the music and quickly discover a tremendous rapport.
[00:25:21.120 --> 00:25:22.720] What about having dinner together?
[00:25:22.720 --> 00:25:26.640] Okay, but she has a gluten problem, which restricts where she can eat.
[00:25:26.640 --> 00:25:28.240] And you like turbot.
[00:25:28.240 --> 00:25:31.040] So a couple of restaurants light up in your view.
[00:25:31.040 --> 00:25:35.440] One of them looks great for a romantic tryst, lots of alcoves and dim lighting.
[00:25:35.440 --> 00:25:47.600] But some of the people who have eaten there tonight have left notes around the place saying, left around in cyberspace, saying that they're obviously understaffed in the kitchen tonight and the food has been coming out cold or reheated.
[00:25:47.600 --> 00:25:51.120] The other place gets raves about the food, but it's a bit bright and noisy.
[00:25:51.120 --> 00:25:53.280] You decide that good food is the thing to go for.
[00:25:53.280 --> 00:25:59.040] Then you remember, damn, you're not actually in Paris, you're in New Delhi and got a bit carried away.
[00:26:00.320 --> 00:26:01.840] That's all right, says your new friend.
[00:26:01.840 --> 00:26:03.200] I'm actually in Albuquerque.
[00:26:03.200 --> 00:26:05.040] I'm a music squatter.
[00:26:05.360 --> 00:26:06.320] What does that mean?
[00:26:06.320 --> 00:26:10.640] Well, she just monitors the network for anybody who's looking for someone who's listening to that recording.
[00:26:10.640 --> 00:26:17.040] The real occupant of the car had his flag down and wasn't talking to anybody, so she just intercepted your query and really quite enjoyed talking to you.
[00:26:17.040 --> 00:26:18.800] And now she's going out to dinner locally.
[00:26:18.800 --> 00:26:19.120] Thanks.
[00:26:19.120 --> 00:26:19.840] Bye.
[00:26:20.800 --> 00:26:25.840] So you track which restaurant she's going to and send a margarita to her table to say thanks.
[00:26:25.840 --> 00:26:29.120] But she's annoyed that you tracked her down and turns the connection off.
[00:26:29.120 --> 00:26:33.000] Oh well, you go back to your day job erecting advertising hoardings on Mars.
[00:26:33.480 --> 00:26:37.720] Soft Mars, that is, which has recently been added to the soft solar system.
[00:26:39.320 --> 00:26:52.600] It is archetypal Douglas Adams to imagine that the highest calling of the internet would be to enable a failed romantic tryst between two lonely Bach enthusiasts in Paris.
[00:26:52.920 --> 00:27:06.040] But if he foresaw internet romance, Douglas also predicted catfishing and even more serious problems that a free-for-all information era would usher into our world.
[00:27:06.360 --> 00:27:13.640] Here is an extract from an article that Douglas wrote in 1999 about the issues of online trust.
[00:27:13.640 --> 00:27:16.680] Read once again by Samuel Barnett.
[00:27:17.320 --> 00:27:22.040] The internet is so new, we still don't really understand what it is.
[00:27:22.040 --> 00:27:27.320] We mistake it for a type of publishing or broadcasting because that's what we're used to.
[00:27:27.320 --> 00:27:36.680] So people complain that there's a lot of rubbish online or that it's dominated by Americans or that you can't necessarily trust what you read on the web.
[00:27:36.680 --> 00:27:41.240] Imagine trying to apply any of those criticisms to what you hear on the telephone.
[00:27:41.240 --> 00:27:50.440] Of course you can't trust what people tell you on the web any more than you can trust what people tell you on megaphones, postcards, or in restaurants.
[00:27:50.440 --> 00:28:00.440] Working out the social politics of who you can trust and why is quite literally what a very large part of our brain has evolved to do.
[00:28:00.760 --> 00:28:17.200] For some batty reason, we turn off this natural scepticism when we see things in any medium which require a lot of work or resources to work in, or in which we can't easily answer back, like newspapers, television, or granite, hence carved in stone.
[00:28:17.840 --> 00:28:23.040] What should concern us is not that we can't take what we read on the internet on trust.
[00:28:23.040 --> 00:28:24.320] Of course, you can't.
[00:28:24.320 --> 00:28:26.000] It's just people talking.
[00:28:26.000 --> 00:28:32.240] But that we ever got into the dangerous habit of believing what we read in the newspapers or saw on the TV.
[00:28:32.240 --> 00:28:37.280] A mistake that no one who has met an actual journalist would ever make.
[00:28:37.920 --> 00:28:44.640] One of the most important things you learn from the internet is that there is no them out there.
[00:28:44.640 --> 00:28:47.040] It's just an awful lot of us.
[00:28:48.000 --> 00:28:56.720] Those of us who remember the early days of social media in the heady 2000s remember it as a happy place.
[00:28:56.720 --> 00:29:07.280] Twitter was this endless tea party where celebrities were your friends and there was an ongoing, kind, intellectual, collaborative conversation.
[00:29:07.280 --> 00:29:14.960] As Douglas says, there was no them, just an awful lot of the best of us.
[00:29:14.960 --> 00:29:26.160] Early Twitter, perhaps, gave a glimpse of the sort of social media that Douglas Adams wanted to build, the guide's vision of the internet.
[00:29:26.160 --> 00:29:28.800] We didn't get that, of course.
[00:29:28.800 --> 00:29:35.760] Instead, we've ended up with an internet in which profit-reaping algorithms prioritize clickbait.
[00:29:35.760 --> 00:29:41.840] That is to say, they prioritize division, anger, and fear.
[00:29:42.160 --> 00:29:53.040] Instead of getting a place where minds can meet, we are being driven to our separate echo chambers and increasingly our separate social networks.
[00:29:53.040 --> 00:30:06.280] We are seeing increasing evidence that social media is a mental health threat, particularly for the young, and a profound threat to social cohesion on the national level.
[00:30:06.280 --> 00:30:09.000] None of this was the plan.
[00:30:09.000 --> 00:30:18.520] Where we have ended up is a sorry, dark shadow of what we had all hoped for at the dawn of the internet.
[00:30:18.520 --> 00:30:21.240] What would Douglas make of all this?
[00:30:21.240 --> 00:30:37.560] We can't know the answer, of course, but perhaps as a proxy, we could ask the people who worked alongside him in those early glory days, the people who strived alongside him, trying to build the kind of internet he wanted to see.
[00:30:37.560 --> 00:30:39.160] Here's Richard Harris.
[00:30:39.480 --> 00:30:48.280] Most social networks, they're not about supporting people, they're not about building community, they're about selling that community to paying advertisers.
[00:30:48.280 --> 00:30:59.240] And I think that's a fundamentally flawed model, as we're seeing in extremis with what's happened to Twitter now and to a large extent what's happening with Facebook.
[00:30:59.240 --> 00:31:05.560] Here's Wired co-founder Ian Charles Stewart with a slightly more nuanced view.
[00:31:05.880 --> 00:31:11.960] First of all, it's easy to beat social media up these days because everybody does it and everybody complains about it and everybody uses it as an excuse for everything.
[00:31:11.960 --> 00:31:20.120] It is nonetheless the way a whole bunch of new people have generated careers for themselves, have been able to get access to communities they would never have found otherwise.
[00:31:20.120 --> 00:31:24.200] The whole notion of long-tail communities is real because of social media.
[00:31:24.200 --> 00:31:31.720] If you cared about something but lived in rural New Zealand in the South Island and had no one else who cared about what you cared about on the internet, you can find them, and social media allows you to do that.
[00:31:31.720 --> 00:31:35.880] So we shouldn't forget the good bits as we complain about the bad bits.
[00:31:35.880 --> 00:31:53.760] But yes, I think we have at this current stage of development, and it's true for all technologies, we have people who are comfortable and feel like they're riding the wave, and then we have those people that feel like every time they try to put their head above water, they're crushed by the wave.
[00:31:53.760 --> 00:31:59.440] Those people tend to feel disenfranchised, they tend to feel angry, and therefore you end up with Twitter commentaries and trolls.
[00:31:59.440 --> 00:32:02.000] Same is true on your YouTube channel, same is true on all media.
[00:32:02.000 --> 00:32:13.280] And I think it's not easy to carry everybody along at the same pace when the world is being so rapidly changed in so many different ways.
[00:32:13.280 --> 00:32:14.960] Social media is just one part of that.
[00:32:14.960 --> 00:32:18.640] I think it's, if anything, it's a microcosm of what's happening in society more broadly.
[00:32:18.640 --> 00:32:22.160] I do think it points to governance challenges.
[00:32:22.160 --> 00:32:32.640] The question as to whether it's the nasty corporates or it's the misleading governments that are most to blame, I think is an open one.
[00:32:32.640 --> 00:32:41.600] The veneer of respectability that's assigned to governments because we theoretically vote for them, I don't think is all that it's cracked up to be.
[00:32:41.600 --> 00:32:52.080] We only have to look at voting in Venezuela, France, Austria, or heavens help us, the US, to question that veneer of respectability.
[00:32:52.080 --> 00:33:00.240] So I don't blame CEOs of big companies, nor blame necessarily heads of government.
[00:33:00.240 --> 00:33:01.360] I think it's a societal thing.
[00:33:01.360 --> 00:33:03.440] We just have to manage our way through.
[00:33:03.440 --> 00:33:09.120] And I think I don't mind, as a member of that society, where that help comes from or where that guidance comes from.
[00:33:09.120 --> 00:33:14.960] And I think sometimes the humor and wit of someone like Douglas is helpful in helping guide people through it.
[00:33:14.960 --> 00:33:24.240] James Goss, the novelist and my old school friend, in 1996 built the first website for a theatre show in the UK.
[00:33:24.240 --> 00:33:32.440] Five years later, James, for the BBC, produced the first webcast of a memorial service in internet history.
[00:33:32.760 --> 00:33:38.920] The memorial service in question was sadly and inevitably that of Douglas Adams.
[00:33:38.920 --> 00:33:44.440] My point is, James Goss, like Douglas, is an early adopter of technology.
[00:33:44.760 --> 00:34:01.640] I really think Douglas would have been amazing on social media, whilst also being the first person to point out that pouring all of your heart and soul and details of your entire life out there is a very stupid thing to do, and also none of it has any meaning.
[00:34:01.640 --> 00:34:04.040] You know, he would have been the first person to diagnose this.
[00:34:04.360 --> 00:34:06.200] And here's Stephen Fry.
[00:34:06.200 --> 00:34:11.880] Stephen, at one point in the early 2000s, had the biggest Twitter following in the world.
[00:34:11.880 --> 00:34:15.720] He has since withdrawn from all social media.
[00:34:16.040 --> 00:34:17.000] How do I put it?
[00:34:17.000 --> 00:34:19.400] I won't say he was fortunate in dying.
[00:34:19.400 --> 00:34:31.400] I mean, his mind when he died had really been unpolluted by what happened to the internet and by the invention of social media and then what happened to that.
[00:34:31.720 --> 00:34:36.200] And he would, of course, have been angry and disappointed and upset, as we all were.
[00:34:36.200 --> 00:34:41.080] And he predicted, yes, you could say he predicted the iPad.
[00:34:41.080 --> 00:34:47.320] You could say he predicted all kinds of ways that you could interface with technology.
[00:34:47.320 --> 00:35:03.480] But he didn't predict in a strange way the most important and terrible fact of technology is that it was still in the hands of these gibbering apes that are human beings and that they would despoil it and that it wouldn't improve them.
[00:35:03.480 --> 00:35:14.680] They would enshittify, to use Cory Doctorow's wonderful word, the enshittification of the internet, you know, and so he didn't live to see the enshittification.
[00:35:15.200 --> 00:35:29.760] When he died, it was still a place of tremendous optimism, and the barriers were being broken down, and frontiers, and disputes, and disagreements of longstanding were melting away, and all was going to be solved by the glory.
[00:35:29.760 --> 00:35:37.280] And what he didn't predict was that, far from making humans nicer, it seems to have made us nastier.
[00:35:37.280 --> 00:35:38.000] You know what I mean?
[00:35:38.000 --> 00:35:50.320] I mean, he started with a view of the lumpen bureaucracy in the shape of Vogons, for example, and the vagueness and the hopeless kind of unreliability of scientists and others.
[00:35:50.320 --> 00:36:02.240] And there's no real malevolence there of the kind that every day despoils our culture with lies and unpleasantness.
[00:36:02.560 --> 00:36:03.840] Quick footnote.
[00:36:03.840 --> 00:36:11.280] My editor has requested that I explain Cory Doctorow's concept of the enshittification of the Internet.
[00:36:11.280 --> 00:36:19.440] I think it's one of those words that actually explains itself rather well: enshittification.
[00:36:19.760 --> 00:36:24.480] The enshittification of the Internet.
[00:36:24.480 --> 00:36:26.000] Moving on.
[00:36:26.000 --> 00:36:50.080] Whilst Douglas didn't live to see it happen, I actually suspect that as an endless worrier, Douglas did fear that the technology he loved might, in the wrong hands and used with the wrong motivations, have terrible consequences and would likely, as Stephen Fry puts it, despoil our culture with lies and unpleasantness.
[00:36:50.400 --> 00:36:57.680] Exhibit one in this argument is Mostly Harmless, the fifth and final Hitchhiker's book.
[00:36:57.680 --> 00:37:15.480] It's a uniquely bleak book in his canon, the darkest by far of the series, ending with the unambiguous death of all of the much-loved characters: Arthur, Ford, Trillian, Zaphod, and the rest.
[00:37:15.800 --> 00:37:22.120] What is interesting, though, in this context is how and why they die.
[00:37:22.120 --> 00:37:30.760] The guide itself, as we've established, is a pretty good proxy for the internet, a user-generated source of information.
[00:37:30.760 --> 00:37:35.160] The guide is a cross between Wikipedia and a social media platform.
[00:37:35.160 --> 00:37:45.880] Even in the first Hitchhiker book, Douglas hints at some of the problems of a crowd-sourced repository of opinion masquerading as fact.
[00:37:46.840 --> 00:37:55.720] The Hitchhiker's Guide to the Galaxy is an indispensable companion to all those who are keen to make sense of life in an infinitely complex and confusing universe.
[00:37:55.720 --> 00:38:06.680] For though it cannot hope to be useful or informative on all matters, it does at least make the reassuring claim that where it is inaccurate, it is at least definitively inaccurate.
[00:38:06.680 --> 00:38:11.640] In cases of major discrepancy, it's always reality that's got it wrong.
[00:38:11.640 --> 00:38:16.440] This was the gist of the notice: it said, The guide is definitive.
[00:38:16.440 --> 00:38:19.560] Reality is frequently inaccurate.
[00:38:20.200 --> 00:38:24.360] And that's in the first book in 1979.
[00:38:24.360 --> 00:38:41.000] By the time we get to Mostly Harmless, in 1992, the guide has been taken over by a ruthless corporation, InfiniDim Enterprises, who plan to use it exclusively for profit and to eliminate all competition.
[00:38:41.000 --> 00:38:44.760] The real-life parallels here scarcely need pointing out.
[00:38:45.120 --> 00:38:55.600] InfiniDim uses sophisticated surveillance, data collection, and an autonomous AI to track and manipulate its users.
[00:38:55.600 --> 00:38:59.680] It defiantly asserts its authority over real life.
[00:38:59.680 --> 00:39:02.400] The guide is definitive.
[00:39:02.400 --> 00:39:05.760] Reality is definitely inaccurate.
[00:39:05.760 --> 00:39:23.680] For example, when the Earth stubbornly insists on existing, despite the guide entry on it categorically stating that it had been demolished, the Guide's sentient AI goes about an elaborate scheme to ensure that reality conforms to its description.
[00:39:23.680 --> 00:39:33.840] To do this, it has to destroy not only the Earth, but all Earths in all realities, once and forever.
[00:39:33.840 --> 00:39:50.240] In the final sentences of the book, Arthur Dent realizes what is about to happen: that he, his friends, his daughter, and everyone he has ever loved are about to be destroyed forever.
[00:39:50.880 --> 00:39:56.080] Things began slowly to reassemble themselves in Arthur's mind.
[00:39:56.400 --> 00:40:01.600] He wondered what he should do, but he only wondered it idly.
[00:40:01.920 --> 00:40:05.760] Around him, people were beginning to rush and shout a lot.
[00:40:06.080 --> 00:40:10.720] But it was suddenly very clear to him that there was nothing to be done.
[00:40:10.720 --> 00:40:12.880] Not now or ever.
[00:40:13.520 --> 00:40:22.480] Through the new strangeness of noise and light, he could just make out the shape of Ford Prefect sitting back and laughing wildly.
[00:40:23.440 --> 00:40:27.120] A tremendous feeling of peace came over him.
[00:40:27.760 --> 00:40:38.360] He knew that at last, for once and forever, it was now all finally over.
[00:40:39.640 --> 00:41:01.480] Hearing that today from the perspective of a society which has had its reality terribly distorted by the enshittification of social media, it's impossible not to see it as a very, very bleak satire on a world that had not yet been born when it was written.
[00:41:01.480 --> 00:41:12.520] Douglas always regretted that his final Hitchhiker book was so bleak, and he planned to go back to the series one day and write a happier ending.
[00:41:12.520 --> 00:41:23.240] Sadly, he never got round to doing that, in the same way that he didn't get to stick around and help guide the evolution of social media.
[00:41:23.240 --> 00:41:39.800] So let's give Douglas the last word in this chapter as he looks to the future, to the guide to the internet, the internet that we didn't get, but the internet that the inventor of The Hitchhiker's Guide to the Galaxy had hoped to build.
[00:41:40.120 --> 00:41:51.080] This is an idea I've been pursuing for a while, and since it grew out of the Hitchhiker's Guide to the Galaxy, I called it H2G2 and started it out as a community website.
[00:41:51.080 --> 00:41:59.160] A community that is of voluntary researchers starting to build the very guide that they would then be able to use.
[00:41:59.160 --> 00:42:01.400] A collaboratively built guide.
[00:42:01.400 --> 00:42:09.480] It's in its infancy, though it has already built up a hugely enthusiastic group of researchers pouring stuff into it.
[00:42:09.480 --> 00:42:18.560] Before we could even get a couple of steps along the way of building the kind of infrastructure we needed to make the thing start to self-organize and self-propagate, guess what?
[00:42:18.560 --> 00:42:23.440] Like every other website on the planet, we ran out of resources or money, as we call it.
[00:42:23.440 --> 00:42:35.680] I hope that one day we can begin to form the basis of something that brings my original vision of The Hitchhiker's Guide to the Galaxy as a mobile, personal, collaborative guide to real life.
[00:42:37.280 --> 00:42:43.680] Our AI Grok is modeled after The Hitchhiker's Guide to the Galaxy, which is one of my favorite books.
[00:42:43.680 --> 00:42:47.360] It's a book on philosophy, disguised as a book on humor.
[00:42:47.360 --> 00:43:02.800] Actually, that forms the basis of my philosophy, which is that we don't know the meaning of life, but the more we can expand the scope and scale of consciousness, digital and biological, the more we're able to understand what questions to ask about the answer
[00:43:02.800 --> 00:43:04.720] that is the universe.
[00:43:05.040 --> 00:43:08.080] So I have a philosophy of curiosity.
[00:43:08.080 --> 00:43:10.160] But I'm afraid I lied.
[00:43:10.160 --> 00:43:13.360] We can't give Douglas the last word.
[00:43:13.360 --> 00:43:24.160] Because it turns out that someone else has come along with their own vision of what an internet based on The Hitchhiker's Guide to the Galaxy would look like.
[00:43:24.480 --> 00:43:31.440] There is no question that Elon Musk is a major figure in this moment of global history.
[00:43:31.440 --> 00:43:37.040] He has done more to speed the transition to electric cars than anyone alive.
[00:43:37.040 --> 00:43:44.480] And through SpaceX, he has pushed the boundaries of space exploration further than ever before.
[00:43:44.480 --> 00:43:52.400] If we actually get to hitchhike the galaxy in our lifetimes, it will largely be because of Elon Musk.
[00:43:52.720 --> 00:44:12.840] And yet, his reinvention of Twitter as an organ of right-wing politics, his championing and partnership with Donald Trump, and his reverse takeover and defenestration of the American government all give one, to put it as neutrally as possible, pause.
[00:44:13.160 --> 00:44:21.480] Now, on the one hand, it is hardly surprising that a tech bro like Elon is a fan of Hitchhiker.
[00:44:21.480 --> 00:44:25.160] But his admiration goes beyond the casual.
[00:44:25.160 --> 00:44:32.920] Elon Musk sent a plaque reading, Don't Panic, into space strapped onto his Tesla Roadster.
[00:44:32.920 --> 00:44:44.280] As he's just told us, he based his AI engine on the guide, and he plans to name one of his Mars settlement ships the Heart of Gold.
[00:44:44.280 --> 00:44:58.440] Even more than any of that, though, Elon talks about Douglas the same way I talk about Douglas, as a profound thinker whose contributions go beyond the realms of comedy.
[00:44:58.440 --> 00:45:04.760] So I find myself puzzling about what exactly Elon sees in Douglas Adams.
[00:45:04.760 --> 00:45:07.560] Is Elon seeing the same thing I see?
[00:45:07.880 --> 00:45:15.080] Here are some other points of view from other Douglas friends and fans, starting with James Goss.
[00:45:15.080 --> 00:45:19.880] If Douglas was alive today, he would have taken great joy in blocking Elon on Twitter.
[00:45:19.880 --> 00:45:26.360] That would have been the thunderclap that just resounds around the world, and it would have happened like three years before the rest of us got there.
[00:45:26.680 --> 00:45:35.000] Emma Westecott is an associate professor of game design at OCAD, Canada's largest art and design university.
[00:45:35.000 --> 00:45:41.720] She's as thoughtful a person on gaming, social media, and interactivity as you will ever meet.
[00:45:41.720 --> 00:45:46.640] And she started her career working with Douglas at the Digital Village.
[00:45:47.200 --> 00:46:01.680] And yet, the sort of tech bros in Silicon Valley have taken it as a to-do list without actually having the reflexivity to sort of see that it may have been a critique or a warning.
[00:46:01.680 --> 00:46:03.760] So I think that's interesting.
[00:46:04.160 --> 00:46:15.360] I think that when Elon Musk sent "Don't Panic" up into space, or whatever he did, I heard Douglas sort of shift in his grave, sort of thing.
[00:46:15.360 --> 00:46:22.080] Max Landis, the screenwriter and comic book author, is as big a Douglas fan as anyone alive.
[00:46:22.080 --> 00:46:25.760] We collaborated on the Gently Netflix series.
[00:46:25.760 --> 00:46:32.720] Max is himself a big and sometimes controversial character with a pretty healthy ego.
[00:46:32.720 --> 00:46:39.120] So I wondered what he would make of Elon sharing our love for Douglas.
[00:46:39.760 --> 00:46:45.040] Elon, what a fascinating evolution of a public figure.
[00:46:45.360 --> 00:46:51.040] Like what a transformation, an unself-conscious transformation into a villain.
[00:46:51.680 --> 00:47:12.720] And I think certain people are cushioned both by their neurological makeup and their parenting and the $400 billion behind them from having to look at the world in a realistic way.
[00:47:12.720 --> 00:47:19.600] And I think Douglas appeals to people who want to think they know a little better because Douglas writes like he knows a little better.
[00:47:19.600 --> 00:47:23.360] He writes like he kind of gets it a little more than you do.
[00:47:23.360 --> 00:47:36.760] However, a lot of times, what people miss, and when you talk about Musk, what I think someone like Musk misses is that ultimately, Hitchhiker's Guide, one anyway, is fucking about Arthur.
[00:47:37.080 --> 00:47:40.840] You're really locked in in his experience in that book.
[00:47:40.840 --> 00:47:46.120] It's about a normal guy who is a victim of systems and institutions.
[00:47:46.120 --> 00:47:55.720] And if you are someone like Elon Musk, who exists at the highest spectrum of human power, he does, he can have anything he wants.
[00:47:55.720 --> 00:47:58.760] I think you want to believe you deserve that position.
[00:47:58.760 --> 00:48:00.760] You have to believe you know better.
[00:48:00.760 --> 00:48:02.680] So I think, does that make sense?
[00:48:02.680 --> 00:48:07.880] It's almost like, it's almost like in his head, he turns Douglas Adams into Ayn Rand.
[00:48:08.520 --> 00:48:14.440] The problem, I mean, the problem with Elon, if we're honest, is he's a Zaphod, but he thinks he's a Ford.
[00:48:14.440 --> 00:48:15.560] Yeah, it's real.
[00:48:16.520 --> 00:48:18.760] You know, when you meet a Ford.
[00:48:18.760 --> 00:48:22.520] So, like, and it's not... Elon, Elon is Zaphod.
[00:48:22.520 --> 00:48:23.480] He would have another head.
[00:48:23.480 --> 00:48:25.640] His kid is named like X23.
[00:48:25.640 --> 00:48:28.520] Like, he would do whatever it took.
[00:48:28.520 --> 00:48:40.600] And I believe if you showed Trump the machine that shows you your size in the universe, the you are here machine, he would have exactly the same reaction as Zaphod.
[00:48:40.600 --> 00:48:43.320] Ah, it zoomed in on me.
[00:48:43.640 --> 00:48:56.760] The image of Elon Musk and Donald Trump sitting in the Total Perspective Vortex is perhaps the best way to give Douglas the last word in this chapter.
[00:49:04.680 --> 00:49:11.160] That was a preview of the new audiobook, Douglas Adams: The Ends of the Earth, from Pushkin Industries.
[00:49:11.160 --> 00:49:21.840] Get Douglas Adams: The Ends of the Earth now at Audible, Spotify, pushkin.fm/audiobooks, or wherever audiobooks are sold.