Debug Information
Processing Details
- VTT File: 18dd2072b0838524ce61c7703383743a.vtt
- Model Used: models/gemini-2.5-flash-lite
- Processing Time: September 12, 2025 at 11:22 AM
- Total Chunks: 2
- Transcript Length: 103,930 characters
- Caption Count: 953 captions
- Temperature: 0.1
- Max Tokens: 1024
- Top-K: Not used
- Top-P: 0.95
- Candidate Count: 1
- Affiliate Tag: spokengoods-20
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.160 --> 00:00:05.520] It really irked me when in 2024 most people were saying AI is here and now the PM job is gone.
[00:00:05.520 --> 00:00:10.720] There were others who were saying AI is here so now we can do the things we don't like to do with AI.
[00:00:10.720 --> 00:00:15.040] What do you find is most common across the companies that are succeeding?
[00:00:15.040 --> 00:00:21.280] Companies that recognize that AI is not this magic thing that you're gonna slather on to your product.
[00:00:21.280 --> 00:00:23.280] The problems are still the problem.
[00:00:23.280 --> 00:00:27.600] Something else that I think is also really important is getting super hands-on with the tech.
[00:00:27.600 --> 00:00:34.880] I've written more code in the last one year than I have in the last 10 years because code is now essentially architecture and English.
[00:00:34.880 --> 00:00:38.800] I'll pick a project that touches a lot of the things that I need to learn.
[00:00:38.800 --> 00:00:41.840] One of the passion projects I have is to automate my house.
[00:00:41.840 --> 00:00:44.880] The idea is that I give the house eyes and ears.
[00:00:44.880 --> 00:00:48.160] The coolest thing is I've specced out a super sensor.
[00:00:48.160 --> 00:00:50.400] It will see people, hear them.
[00:00:50.400 --> 00:00:55.680] It'll sense humidity and temperature and I'm building the hardware myself.
[00:00:56.960 --> 00:01:00.400] Today my guests are Aji and Ezine Udezwe.
[00:01:00.400 --> 00:01:05.920] They are married, both longtime product leaders with over 50 years of combined product experience.
[00:01:05.920 --> 00:01:16.400] They're also the authors of a new book that I love called Building Rocket Ships, Product Management for High Growth Companies, which is a synthesis of their biggest product lessons over the course of their entire career.
[00:01:16.400 --> 00:01:22.240] Aji was chief product officer at Calendly and Typeform and led product teams at Twitter, Atlassian, and Microsoft.
[00:01:22.240 --> 00:01:26.160] Ezine was CPO at WP Engine and VP of Product at ProCore.
[00:01:26.160 --> 00:01:29.840] We chat about what is changing in the role of product management.
[00:01:29.840 --> 00:01:31.600] Also, what is staying the same?
[00:01:31.600 --> 00:01:44.640] We also get into the shift of PM to edge ratios that AI is introducing, the five skills becoming most important in being a successful PM over the coming years, and the single biggest lesson that each of them have learned over the course of their careers.
[00:01:44.640 --> 00:01:48.880] If you enjoy this podcast, don't forget to subscribe and follow it in your favorite podcasting app or YouTube.
[00:01:48.880 --> 00:01:50.720] It helps tremendously.
[00:01:50.720 --> 00:02:05.320] And if you become an annual subscriber of my newsletter, you get a year free of 15 incredible products, including Lovable, Replid, Bolt, NADN, Linear, Superhuman, Descript, Whisperflow, Gamma, Perplexity, Warp, Granola, Magic Patterns, Raycast, Chat PRD, and Mobbin.
[00:01:59.920 --> 00:02:08.920] Head over to lennysnewsletter.com and click product pass.
[00:02:08.920 --> 00:02:13.000] With that, I bring you Aji and Ezine Odeswe.
[00:02:13.000 --> 00:02:15.160] This episode is brought to you by Mercury.
[00:02:15.160 --> 00:02:17.400] I've been banking with Mercury for years.
[00:02:17.400 --> 00:02:20.760] And honestly, I can't imagine banking any other way at this point.
[00:02:20.760 --> 00:02:24.280] I switched from Chase, and holy moly, what a difference.
[00:02:24.280 --> 00:02:30.920] Sending wires, tracking spend, giving people on my team access to move money around, so freaking easy.
[00:02:30.920 --> 00:02:39.320] Where most traditional banking websites and apps are clunky and hard to use, Mercury is meticulously designed to be an intuitive and simple experience.
[00:02:39.320 --> 00:02:48.600] And Mercury brings all the ways that you use money into a single product, including credit cards, invoicing, bill pay, reimbursements for your teammates, and capital.
[00:02:48.600 --> 00:03:05.240] Whether you're a funded tech startup looking for ways to pay contractors and earn yield on your idle cash, or an agency that needs to invoice customers and keep them current, or an e-commerce brand that needs to stay on top of cash flow and excess capital, Mercury can be tailored to help your business perform at its highest level.
[00:03:05.240 --> 00:03:09.240] See what over 200,000 entrepreneurs love about Mercury.
[00:03:09.240 --> 00:03:12.920] Visit Mercury.com to apply online in 10 minutes.
[00:03:12.920 --> 00:03:18.600] Mercury is a fintech, not a bank, banking services provided through Mercury's FDIC insured partner banks.
[00:03:18.600 --> 00:03:21.000] For more details, check out the show notes.
[00:03:21.000 --> 00:03:27.160] My podcast guests and I love talking about craft and taste and agency and product market fit.
[00:03:27.160 --> 00:03:28.840] You know what we don't love talking about?
[00:03:28.840 --> 00:03:29.720] SOC2.
[00:03:29.960 --> 00:03:31.400] That's where Vanta comes in.
[00:03:31.400 --> 00:03:39.320] Vanta helps companies of all sizes get compliant fast and stay that way with industry-leading AI, automation, and continuous monitoring.
[00:03:39.320 --> 00:03:49.760] Whether you're a startup tackling your first SOC 2 or ISO 27001 or an enterprise managing vendor risk, Vanta's trust management platform makes it quicker, easier, and more scalable.
[00:03:50.080 --> 00:03:56.560] Vanta also helps you complete security questionnaires up to five times faster so that you can win bigger deals sooner.
[00:03:56.560 --> 00:04:05.920] The result, according to a recent IDC study, Vanta customers slashed over $500,000 a year and are three times more productive.
[00:04:05.920 --> 00:04:08.320] Establishing trust isn't optional.
[00:04:08.320 --> 00:04:10.400] Vanta makes it automatic.
[00:04:10.400 --> 00:04:15.120] Get $1,000 off at Vanta.com/slash Lenny.
[00:04:18.000 --> 00:04:22.480] Aji and Ezine, thank you so much for being here and welcome to the podcast.
[00:04:22.480 --> 00:04:24.400] Thank you for having us, Lenny.
[00:04:24.400 --> 00:04:26.240] It's absolutely my pleasure.
[00:04:26.240 --> 00:04:29.040] You guys are a rare guest duo.
[00:04:29.040 --> 00:04:30.320] You two are married.
[00:04:30.320 --> 00:04:32.400] You're both longtime product leaders.
[00:04:32.400 --> 00:04:36.080] You've been in product combined for over 50 years.
[00:04:36.080 --> 00:04:43.120] The cool thing about the fact that you guys have been in product for so long is that you've seen a lot of change in the role of a PM and the role of building products.
[00:04:43.120 --> 00:04:45.360] You've also seen what doesn't change.
[00:04:45.360 --> 00:04:49.440] And so let me actually start here, and I'll throw this to Ezina.
[00:04:49.440 --> 00:04:56.720] What are you seeing changing most in the role and day-to-day job of a product manager?
[00:04:56.720 --> 00:04:58.800] And what are you seeing staying the same?
[00:04:58.800 --> 00:05:06.000] At a high level, the core value that a PM has to deliver hasn't changed.
[00:05:06.000 --> 00:05:17.040] It really is de-risking the product delivery process while also really trying to maximize the value the business gets from these investments.
[00:05:17.040 --> 00:05:35.080] But tactically, I think that the good thing is that we're finding out that PMs are freed up more, and as a result, they can actually invest more in true developing true insights about their customer.
[00:05:35.400 --> 00:05:47.800] So you're seeing or you're finding out your engineering leaders, engineering team is they're asking you to really confirm your insight and your instinct about the customer.
[00:05:47.800 --> 00:05:56.520] The second thing I would say is that there's a need to orchestrate a little differently.
[00:05:56.520 --> 00:06:02.120] So in the past it was around people orchestrating, people getting into the room, et cetera.
[00:06:02.120 --> 00:06:04.520] But now there are many moving pieces.
[00:06:04.920 --> 00:06:19.480] There's the software, there are also the feedback loops, the LLMs that are the multiple touch points that you have in technology, ensuring that those are also being built in a way that allows for true learning within your software.
[00:06:19.480 --> 00:06:22.760] So that idea of working static software isn't true anymore.
[00:06:22.760 --> 00:06:29.560] It's almost like living, breathing software, and ensuring that the things that allow it to breathe actually occur.
[00:06:29.560 --> 00:06:39.400] The other piece, two other pieces, data literacy, an understanding of what it is that your product is learning from.
[00:06:40.440 --> 00:06:43.800] Where is the data going and how is it being used?
[00:06:43.800 --> 00:06:52.760] How is it being organized so that it could be leveraged for future insight about your product use and customer use?
[00:06:52.760 --> 00:07:03.560] And then the last I would say is just creating the right guardrails within your product to ensure, that it actually continues to do what it needs to do.
[00:07:03.560 --> 00:07:08.440] You know, we spoke earlier, and I have a whole thing about ethics of AI, et cetera.
[00:07:08.440 --> 00:07:12.840] And I think that we have a responsibility as product people.
[00:07:12.840 --> 00:07:19.120] And I'm seeing PMs ask the questions that we didn't get to see them ask when we were building social media.
[00:07:14.840 --> 00:07:21.040] So, those are the kind of things I would say.
[00:07:21.360 --> 00:07:23.200] I would definitely want to come back to that topic.
[00:07:23.200 --> 00:07:29.760] There's something you just talked about that connects so deeply with something a recent guest talked about, Asha Sharma.
[00:07:29.760 --> 00:07:45.680] She's a chief vice president at Microsoft for Gandhi AI, and she had this concept that products are evolving from product as artifact, where it's this like done thing you ship to a product as an organism that continues to evolve based on data that it's feeding and this metabolism of data.
[00:07:46.000 --> 00:08:00.480] In our book, you'll see we talk about systems, and I have this innate passion around the idea of what a system is, and it has to do with that living, breathing, and interaction with every other thing.
[00:08:00.480 --> 00:08:02.480] So, yes, yes, yes, and yes.
[00:08:02.800 --> 00:08:05.440] Let me throw over to Aji this first point you made.
[00:08:05.760 --> 00:08:16.400] I love this idea that PMs, there's more time spent almost at kind of the top of the funnel, the top of the product development lifecycle, confirming insights from customer data, deciding what to build.
[00:08:16.560 --> 00:08:26.240] There's talk recently of just PMs almost becoming the bottleneck for teams because engineers are moving so much faster and PMs are the same, just like sitting, maybe docs are easier to write.
[00:08:27.280 --> 00:08:28.320] What do you see there?
[00:08:28.320 --> 00:08:37.520] What do you think about just the, I don't know, this ratio shift potentially for PMs almost become a bottleneck with AI accelerating other functions?
[00:08:37.840 --> 00:08:43.280] For the last 20 years, we've had like very fixed ratios in terms of how what gets done.
[00:08:43.280 --> 00:08:48.320] Like, PM takes this amount of time, developers take this amount of time, go to market takes amount of time.
[00:08:48.320 --> 00:08:58.480] And I think that contract is basically exploding because the build process, you know, sort of the whole solutionary process of the build process is accelerating so fast.
[00:08:58.480 --> 00:09:05.320] And so a company we worked with told us that, you know, they're sort of like a build contract build company.
[00:09:05.640 --> 00:09:12.680] They said they would get off the phone from a pitch, and in four hours they would have a prototype.
[00:09:12.680 --> 00:09:15.320] And so the question is, like, what is a PM doing?
[00:09:15.320 --> 00:09:20.840] The person listening to that is half PM, half developer, in order to be able to produce that.
[00:09:20.840 --> 00:09:26.200] And so what we're seeing is that the way, you know, PMs do three things.
[00:09:26.200 --> 00:09:28.840] We think about what the customers want that is profitable.
[00:09:28.840 --> 00:09:33.320] So working on sharp problems and defining sharp problems and what's in the customer's mind and makes it sharp.
[00:09:33.320 --> 00:09:36.120] We support the solutioning and the build of it.
[00:09:36.120 --> 00:09:40.760] And then we don't do this enough, but we need to support the go-to-market of it.
[00:09:40.760 --> 00:09:45.400] And so what's happening is that this middle thing is becoming more capable and much faster.
[00:09:45.400 --> 00:09:48.120] And so the way we support it, it has to change.
[00:09:48.360 --> 00:09:52.200] PRD writing is insufficient at the moment.
[00:09:52.840 --> 00:09:59.720] You know, sort of like the static way we ask questions about what's in the customers' minds is like that stuff needs to be like faster.
[00:09:59.720 --> 00:10:01.640] The cycle time is so fast now.
[00:10:01.640 --> 00:10:03.240] And so PMs need to adapt.
[00:10:03.240 --> 00:10:07.960] And that needs new tool sets, new skill sets, and so on and so forth.
[00:10:07.960 --> 00:10:10.680] I think that's where the most pressure is.
[00:10:10.680 --> 00:10:19.640] But at the same time, like Azina said, PMs can take some of the, if they adapt to this really well, they can spend a lot more time here, right?
[00:10:19.640 --> 00:10:22.120] And that's where there's a lot of opportunity.
[00:10:22.120 --> 00:10:28.840] And, you know, the thing that we want PMs to do is not be the crouch, not worry about people saying, oh, you're the bottleneck.
[00:10:28.840 --> 00:10:35.880] No, don't be the bottleneck, first of all, but also since you see everything, add value in other places where traditionally you ever spend time.
[00:10:35.880 --> 00:10:38.360] I want to pull on this sharp problem thread.
[00:10:38.360 --> 00:10:40.360] This is my favorite Audi meme.
[00:10:40.360 --> 00:10:43.560] We talked about this a bit when you visited the podcast last.
[00:10:43.560 --> 00:10:45.680] I think it's such a useful concept.
[00:10:45.680 --> 00:10:48.400] So I just want to make sure people learn this while we're talking about it.
[00:10:44.840 --> 00:10:51.040] Talk about what a sharp problem is and why that's important.
[00:10:51.360 --> 00:11:05.040] One of the things I learned from failing a lot at building companies is that sometimes, you know, the way that YC other people pitch it is that you build something and if it doesn't work, you pivot, da-da-da, you know, stuff like that.
[00:11:05.040 --> 00:11:19.120] But it turns out that in the world of high technology and the world of figuring out what customers need, there are many shortcuts that mean that pivoting isn't just like a habit, something you walk around drunkenly doing, right?
[00:11:19.120 --> 00:11:24.240] A lot of the most successful people, you know, Bill Gates, Facebook, and so they didn't even pivot.
[00:11:24.240 --> 00:11:28.000] They had an idea, but it turned out that the idea was a sharp problem.
[00:11:28.000 --> 00:11:39.360] So the way to avoid drunken startup building is pick things that are old.
[00:11:40.240 --> 00:11:42.080] They're core needs.
[00:11:42.080 --> 00:11:48.880] And the way you profit is when you reimagine old needs in new technological ways.
[00:11:48.880 --> 00:11:50.160] And that's important.
[00:11:50.160 --> 00:11:53.200] Now, let's frame it in the sharp problem.
[00:11:53.200 --> 00:11:57.200] What do people feel they need help with?
[00:11:57.200 --> 00:11:58.720] What is still difficult?
[00:11:58.720 --> 00:12:12.560] To the point that if you improve it three to five times, maybe 10x, or you take cost out of it, like 10x to cost, right, one tenth, they'll be like, this is so compelling, you should take my money now.
[00:12:12.560 --> 00:12:16.320] And that's what Lisbon should really, really focus on if they can.
[00:12:16.640 --> 00:12:28.480] There's extended stuff we've talked about over the years, like the Unicorn framework, which is like, if you're building for B2B specifically, how do you draw a quadrant around the problem so you know if it's highly frequent?
[00:12:28.480 --> 00:12:32.040] Because frequency also means a lot of pain and figure it out.
[00:12:29.760 --> 00:12:33.960] But that's something people can read on the substar.
[00:12:34.200 --> 00:12:37.880] Yeah, and we talked about that in our last chat, so I'll point people to that.
[00:12:37.880 --> 00:12:54.040] Okay, so building on this idea of being a drunken walk of figuring out what to build, you have this concept of the shipyard, and this is kind of a way to think about how to think about what product development might look like going forward and just a model for how to work in this chaotic world that we're in.
[00:12:54.040 --> 00:12:57.400] Talk about what that looks like in the advice here.
[00:12:57.400 --> 00:12:59.080] I'm going to give some credit.
[00:12:59.080 --> 00:13:07.080] When I was at Lassian, the first time I encountered the idea was with Jeff Redford, he's now like a VC, and he talked about it.
[00:13:07.080 --> 00:13:12.200] But you know, I've taken the idea because I loved it to maybe a whole different level, maybe it says a clarity.
[00:13:12.200 --> 00:13:15.880] A shipyard is to evoke controlled chaos.
[00:13:15.880 --> 00:13:30.120] If you go to a shipyard at peak, you know, Long Beach or whatever it is, there's so much going on and like big heavy machinery moving around, trucks and containers from China, from Malaysia jumping around.
[00:13:30.120 --> 00:13:38.280] But this chaos underlying it is careful communication, high skill, right?
[00:13:38.520 --> 00:13:46.280] So, what looks like you know, Brownian motion is actually orchestrated progress.
[00:13:46.280 --> 00:13:54.760] And, you know, these are these shipyards at the center of GDP for a lot of nations because this is where trade happens.
[00:13:54.760 --> 00:13:58.040] So, this is what it's supposed to evoke: controlled chaos.
[00:13:58.040 --> 00:14:01.720] And so, what a shipyard team looks like is a six-person team.
[00:14:01.720 --> 00:14:14.120] It is PM, it is engineering, it is design, it is user research, it is data, ML, AI expertise, AI PMs, or whatever you call it.
[00:14:14.120 --> 00:14:19.280] It is product marketing in a pod, problem solving.
[00:14:19.280 --> 00:14:25.520] You know, in this, especially in the AI age, the idea of stand-ups isn't really even make as much sense, right?
[00:14:25.520 --> 00:14:33.600] These people should be, you know, communicated and collaborating hourly basically to solve the problem because especially they're working in new problem states.
[00:14:33.600 --> 00:14:35.840] So I think that's what the core shipyard team means.
[00:14:35.840 --> 00:14:44.000] And then there are these tendrils to, you know, we call salespeople, customer success people, the nervous, sort of the skin.
[00:14:44.000 --> 00:14:47.040] Like, think about the shipyard team as the brain, but this is the skin.
[00:14:47.040 --> 00:14:49.120] This is how you feel customers.
[00:14:49.120 --> 00:14:52.240] And so here's a concrete example.
[00:14:52.560 --> 00:14:59.360] Before I shipped anything at Caldly, I did the design review with a support manager.
[00:14:59.600 --> 00:15:09.520] The support manager had to be in the room because the design people and engineering people, as much as we cared, we shipped problems all the time, you know, featured that.
[00:15:09.520 --> 00:15:12.400] Where the support manager looks at it and is like, that's not going to work.
[00:15:12.400 --> 00:15:16.640] I've had a thousand conversations and it was so useful.
[00:15:16.640 --> 00:15:21.440] And so having the shipyard team have the tendrils of connection to customer teams was so important.
[00:15:21.440 --> 00:15:24.000] So what I'm hearing is essentially embrace chaos.
[00:15:24.000 --> 00:15:25.600] Things are going to get only weirder.
[00:15:25.600 --> 00:15:31.520] The something, I forget who, maybe the CPO at OpenAI said just like it's only going to get weirder.
[00:15:31.520 --> 00:15:34.560] This is the least, this is the most normal the world will be.
[00:15:35.440 --> 00:15:35.680] Yep.
[00:15:36.000 --> 00:15:49.840] And that's important because if the world is weird and you want to make sense of it, you need people who are highly motivated, have the right skill sets to continuously create it as it moves, as it gets weird.
[00:15:49.840 --> 00:15:57.520] Audi, you said EP, I remember Lenny asked how many, you know, what is what is shipyard comprised of?
[00:15:57.520 --> 00:15:59.360] He said six people person team.
[00:15:59.360 --> 00:16:02.760] I know that's going to come up, and people are going to go, sorry, maybe why?
[00:16:03.400 --> 00:16:04.520] It's six person.
[00:16:04.680 --> 00:16:06.200] Six capability team.
[00:15:59.920 --> 00:16:06.280] Okay.
[00:16:06.760 --> 00:16:08.200] It is not a person thing.
[00:16:08.520 --> 00:16:11.480] It is engineering, however many you need.
[00:16:11.480 --> 00:16:13.080] Again, it's all about admission.
[00:16:13.080 --> 00:16:14.840] It's PM, it's design.
[00:16:15.160 --> 00:16:17.640] It's the capabilities I'm optimizing for, right?
[00:16:18.040 --> 00:16:22.840] Not that data and ML, it's product marketing and it's a user research.
[00:16:23.000 --> 00:16:24.200] PMs can't do it.
[00:16:24.200 --> 00:16:25.640] You need the capabilities.
[00:16:25.640 --> 00:16:32.440] And whether it's a fractional person within Matrix into that EPD team or 10 people because of developers, that's what you need.
[00:16:32.440 --> 00:16:32.680] Yeah.
[00:16:32.680 --> 00:16:32.920] Yeah.
[00:16:32.920 --> 00:16:36.680] So what I'm hearing there is roles will become blurrier.
[00:16:36.680 --> 00:16:42.200] That PMs need to do more design, engineers need to do more PMing, people need more data skills.
[00:16:42.600 --> 00:16:48.040] So to me, those are the two takeaways: is just learn to be more comfortable with chaotic world.
[00:16:48.200 --> 00:16:51.480] Just AI constants, every week there's like, oh, this is changing everything.
[00:16:51.480 --> 00:16:53.160] Rethink the roadmap.
[00:16:53.160 --> 00:16:56.760] And then also the roles will be less divided.
[00:16:56.760 --> 00:17:00.520] People will need to learn to do many roles and chip in.
[00:17:00.520 --> 00:17:00.920] Yes.
[00:17:00.920 --> 00:17:01.720] Awesome.
[00:17:01.720 --> 00:17:19.320] So building on that, Ezine, when you hire PMs these days, when you're looking for product leaders, what do you, is there something you're looking for more and more because you believe this will be more important to be successful in a world where AI is just changing everything?
[00:17:19.640 --> 00:17:24.040] I think OG started, he laid the foundation when he said it's day one.
[00:17:24.360 --> 00:17:32.840] And when you're working and thinking about building product in this type of era of space, it's important that you remember that.
[00:17:32.840 --> 00:17:41.960] So, I'm going to talk about what it is that I typically will look for from an attitude's values, and behaviors point of view, as well as a skills point of view.
[00:17:41.960 --> 00:17:49.120] I'm currently working with a firm in LA and I'm building curriculum for their PMs as well as helping them recruit.
[00:17:49.440 --> 00:17:56.160] And the things that we've honed in on are curiosity.
[00:17:56.160 --> 00:18:01.840] And I know it's been said over and over again, but it matters more now.
[00:18:01.840 --> 00:18:12.480] It is, is this person humble and curious enough to admit they don't know and are willing to start as a learner?
[00:18:12.480 --> 00:18:21.920] Do they truly have a growth mindset where they are okay being taught by someone else, no matter how senior they have become?
[00:18:21.920 --> 00:18:23.120] That is very important.
[00:18:23.120 --> 00:18:26.640] That humility and curiosity mixed together.
[00:18:26.640 --> 00:18:30.800] The second part is what I'd call agency.
[00:18:30.800 --> 00:18:41.440] Being able to see opportunity and not ask others for permission to go execute on that opportunity, but actually take the opportunity themselves.
[00:18:41.440 --> 00:18:46.000] Have a strong, there's another word most people use for this.
[00:18:46.320 --> 00:18:49.200] It's not coming to me, but just take a bull.
[00:18:49.360 --> 00:18:49.920] Pull by the horn.
[00:18:49.920 --> 00:18:52.720] But no, there's another characteristic value, behavior.
[00:18:53.200 --> 00:19:00.640] But high agency, being able to just move forward and feel, act like an owner, right?
[00:19:00.960 --> 00:19:02.960] Ownership, that's the word.
[00:19:03.280 --> 00:19:04.800] High ownership.
[00:19:05.280 --> 00:19:16.560] But I think ownership is seeing that you can run, but I think agency assumes that you control, you're more of a thermostat than a thermometer.
[00:19:16.560 --> 00:19:23.680] And that means that, hey, the thermometer is always measuring the temperature of the room, while a thermostat will change the temperature of the room.
[00:19:23.680 --> 00:19:26.080] So I'm looking for that in people.
[00:19:26.080 --> 00:19:29.840] And I want people who can move from a position of strength.
[00:19:30.200 --> 00:19:36.360] It really irked me when in 2024, most people were saying AI is here, and now the PM job is gone.
[00:19:36.360 --> 00:19:45.000] And there were others who were saying AI is here, so now we can do the things we don't like to do with AI and then go focus on the things that we need to do.
[00:19:45.000 --> 00:19:49.080] So people who spoke that way are the types of people I want on my team.
[00:19:49.080 --> 00:19:54.920] From a skills perspective, obviously, an understanding of data, right?
[00:19:54.920 --> 00:20:01.080] Just a high level of how is it organized, how is it leveraged in this new world of AI.
[00:20:01.400 --> 00:20:04.920] There's this skill of being able to write evals.
[00:20:05.080 --> 00:20:14.360] I know everybody can write prompts, prompt engineering, you can try and focus the LLM so that it can offer better insights and offer better results.
[00:20:14.360 --> 00:20:26.520] But even as your LM actually produces results, you need to verify that it actually is smart and it hasn't hallucinated or gone off by creating true evals, right?
[00:20:26.840 --> 00:20:28.440] Those are some hardcore skills.
[00:20:28.440 --> 00:20:41.560] Being able to constrain hallucination, knowing what ways you do that, being able to use multiple models and figure out what is good about this model versus that.
[00:20:41.880 --> 00:20:44.920] One of the things I love to talk about is what does intelligence mean?
[00:20:44.920 --> 00:20:48.120] Intelligence, often people say, is connecting dots.
[00:20:48.120 --> 00:21:00.360] And the beauty of what it is that we have an opportunity to do is that with AI, you can find, you can train it so that it's very specific to a very particular task, right?
[00:21:00.360 --> 00:21:02.760] It can become really smart in one area.
[00:21:02.760 --> 00:21:15.520] And the idea now is: how do you combine the smarts of what maybe Claude does, right, with something that GPT does better and make something even greater than each of them?
[00:21:16.000 --> 00:21:19.600] So that's one of the things I'm looking for in a PM.
[00:21:19.600 --> 00:21:27.920] And then being able to tweak for best for performance, whether it's choosing, whether it's just fine-tuning, doing the weight adjustments.
[00:21:27.920 --> 00:21:29.760] Oji is very passionate about this one.
[00:21:29.760 --> 00:21:31.360] We talk about it quite a bit.
[00:21:31.840 --> 00:21:38.960] But that's the attitudes point of view where, you know, just strong ownership, strong agency, curiosity, being humble.
[00:21:38.960 --> 00:21:47.280] And then this idea of understanding beyond prompts, like, you know, really knowing that AI is just a toolbox, right?
[00:21:47.280 --> 00:21:49.040] And it can make major mistakes.
[00:21:49.040 --> 00:22:03.520] And how do you constrain it such that constrain it, optimize it, tweak it so that it can actually work on your behalf and not like it, you need to make it work for you versus being lost in all the things that it's able to do.
[00:22:03.520 --> 00:22:12.400] I love this answer, and it resonates so well with so many other people's advice, which is curiosity, skills that matter most.
[00:22:12.400 --> 00:22:17.920] And this actually curiosity comes up a lot when I ask people what skills are you focusing on teaching your kids.
[00:22:18.240 --> 00:22:22.320] The number one skill people mention is curiosity.
[00:22:22.320 --> 00:22:23.840] I love that you included humbleness.
[00:22:23.840 --> 00:22:24.880] I want to come back to that.
[00:22:24.880 --> 00:22:34.400] So essentially, curiosity, humbleness, agency, such a meme these days and what PMs need to build and be strong at high agency.
[00:22:34.400 --> 00:22:35.680] So great.
[00:22:35.680 --> 00:22:37.520] And then evals, I like that a lot.
[00:22:37.520 --> 00:22:38.640] That's also super common.
[00:22:38.640 --> 00:22:46.240] I have a really cool guest post coming out really soon that'll teach you more than you ever thought you needed to know about how to become really good at evals.
[00:22:46.240 --> 00:23:02.280] But I want to double down on something you just said, which I love so much, which is this idea of instead of waiting to see what jobs AI takes, it's almost like the way you phrased it essentially: is what jobs can you take as a PM?
[00:23:02.280 --> 00:23:03.240] What can you do?
[00:23:03.240 --> 00:23:04.360] Can you do more engineering?
[00:23:04.360 --> 00:23:05.800] Can you get more design?
[00:23:05.800 --> 00:23:07.640] Such an empowering way to think about it.
[00:23:08.280 --> 00:23:10.600] Let me talk about something that's really personal.
[00:23:11.240 --> 00:23:21.560] I'm part of the women in product organization, and I was asked to guess right end of last year because there was a lot of fear in the community.
[00:23:21.560 --> 00:23:25.640] And there was this thing called the rise of the CTPO.
[00:23:25.640 --> 00:23:37.880] It's this thing, it's been in the background there, where I think many venture VCs, many PEs were asking, do I need a CPO as well as a CTO?
[00:23:37.880 --> 00:23:38.440] Right?
[00:23:38.760 --> 00:23:49.880] And I was always amazed at how many people were leaned back in this thing versus actually showing and proving the value of what a CPO could be.
[00:23:49.880 --> 00:23:58.120] So in many places, it is right that maybe you do only want one person, particularly when it's very well-known spaces, right?
[00:23:58.120 --> 00:24:01.240] It's clear, they're not doing anything revolutionary.
[00:24:01.240 --> 00:24:05.240] But it's like saying an engineer is the same as a product person when you think about it that way.
[00:24:05.240 --> 00:24:14.920] There is a conflict between the CPO and the CTO that actually drives innovation, in my opinion, because one person is trying to optimize the resourcing, the fixed, right?
[00:24:14.920 --> 00:24:21.800] While another person is actually trying to field opportunities and figure out how to wait and decide what is best to do it.
[00:24:21.800 --> 00:24:24.120] At the higher level, those are their jobs.
[00:24:24.120 --> 00:24:33.160] And that tension is really good because you need them to sharpen each other as you try to maximize the value of your organization, of the organization.
[00:24:33.160 --> 00:24:51.840] I say all this to say that there are folks who are not stopping to figure out how do we make the pie bigger, but they're looking at instead how do we divide this pie as it gets smaller, versus asking the question of Rimi, what is the opportunity to increase this for all of us?
[00:24:51.840 --> 00:24:54.880] And that is a skill set I also look out for.
[00:24:54.880 --> 00:24:55.280] Awesome.
[00:24:55.600 --> 00:24:58.800] I'm going to go to Yaji in a second, but I want to come back on this humble idea.
[00:24:58.800 --> 00:24:59.840] I haven't heard this before.
[00:25:00.560 --> 00:25:03.600] Why do you find it really important for people to be humble?
[00:25:03.600 --> 00:25:05.920] I haven't heard that piece of advice yet.
[00:25:05.920 --> 00:25:07.520] Oh my gosh.
[00:25:07.840 --> 00:25:11.120] Particularly now, it is really important to be.
[00:25:11.520 --> 00:25:14.480] Because I would say, okay, I went to school.
[00:25:14.480 --> 00:25:18.640] I actually did neural networks in school when I was doing a project.
[00:25:18.640 --> 00:25:24.800] And, you know, you get to this part of your career where you go, I kind of know a lot of what product is.
[00:25:24.800 --> 00:25:38.720] And I do see folks who are not willing to go sign up for that new junior AI PM course because they're CPO or they're head of product or they're X, Y, or Z, or they're a principal PM now.
[00:25:39.040 --> 00:25:40.560] And I think that that's a mistake.
[00:25:40.560 --> 00:25:43.760] I think that it's important to realize that you don't know everything.
[00:25:43.760 --> 00:25:45.440] You can't know everything.
[00:25:45.440 --> 00:25:59.920] And even as a leader, CPO, head of product, et cetera, I come from this school where I believe that in order to lead an organization, sometimes you actually do need to know how to do the work.
[00:25:59.920 --> 00:26:02.000] And I think I lead better that way.
[00:26:02.000 --> 00:26:07.920] So being able to sign up and say, you know what, let me sit in and figure out how we're going to build this thing.
[00:26:07.920 --> 00:26:17.360] And number one, learn for yourself, but also figure out what questions other PMs are asking in that course, right?
[00:26:17.360 --> 00:26:31.000] So it's about taking courses from people you may have considered junior, figuring out and sitting and reading books that you thought you, you know, just going back to the day one of how do you build product in an AI world.
[00:26:31.000 --> 00:26:36.360] So that's where humility comes to play in this period where we are.
[00:26:36.360 --> 00:26:42.680] It's you just don't know everything, can't know everything, and things will change faster than you're able to stay ahead of.
[00:26:42.680 --> 00:26:45.800] So lean on your team as well.
[00:26:45.800 --> 00:26:48.360] I just had this guy, Chip Connolly, on the podcast.
[00:26:48.360 --> 00:26:49.720] He's in his 60s.
[00:26:49.720 --> 00:26:53.560] He joined Airbnb when he was 52 as an intern.
[00:26:53.560 --> 00:26:57.480] And he said exactly the same thing to be successful.
[00:26:57.480 --> 00:27:03.880] And it's interesting as a metaphor, as an older person, you need to be really good at being open to learning from people younger than you.
[00:27:03.880 --> 00:27:16.280] And that's a really interesting, I think, lens and heuristic almost for just people that aren't as AI savvy yet, just leaning into that idea of just like, okay, I can actually learn a lot from people around me, even if they're much younger than me.
[00:27:16.280 --> 00:27:32.360] Okay, so Aji, something else that I think that you're really good at and you've been spending time on that I think is also really important is getting super hands-on with the tech, not just like reading and not just listening to podcasts like this, not just reading newsletters, but like actually building stuff.
[00:27:32.360 --> 00:27:40.040] Talk about just like why that's important, what you've been doing to learn and to stay ahead and not just be like a pontificate or guy in the clouds.
[00:27:40.040 --> 00:27:42.760] I just love everything we've said in the last few minutes.
[00:27:43.640 --> 00:27:48.600] I think the shorthand I have for humility is humility is teachability.
[00:27:49.480 --> 00:27:52.200] And when do you need teachability?
[00:27:52.200 --> 00:28:01.240] You need teachability when everything has changed, where there is no blueprint.
[00:28:01.240 --> 00:28:09.800] And whatever you think is a blueprint of AI in the last few years, you know, whether, oh, wow, Lavable is so successful, the Red Player is so successful.
[00:28:09.800 --> 00:28:14.120] It's actually quite possible that any one of those companies won't be around in two, three years, right?
[00:28:14.120 --> 00:28:18.400] Because we're still writing the playbook for what success looks like.
[00:28:18.720 --> 00:28:24.320] So humility is teachability, and teachability is survivability if you're thinking about careers.
[00:28:24.320 --> 00:28:29.280] And if you also, if you're thinking about building something that's important, it's also important like that.
[00:28:29.680 --> 00:28:38.160] In terms of how to keep your hand sharp, what I would say is a couple of things.
[00:28:38.160 --> 00:28:49.600] One is, I have written, Isaiah and I both have a master's engineering, but we very quickly figured out that the thing that we were gifted at was helping to invent the future.
[00:28:49.600 --> 00:28:53.280] Where customers are thinking, you know, how do we work with developers to do that?
[00:28:53.280 --> 00:28:55.760] And so we've been PMs forever.
[00:28:55.760 --> 00:28:59.440] And being a PM means that even though we could, we don't write code.
[00:28:59.440 --> 00:29:01.920] We work with developers, we work with other people in the market.
[00:29:01.920 --> 00:29:12.720] But I've written more code in the last one year than I have in the last 10 years because code is now essentially architecture and English or whatever language that you have.
[00:29:12.720 --> 00:29:13.840] And so what does that mean?
[00:29:13.840 --> 00:29:16.880] I've taken the opportunity to write more code.
[00:29:16.880 --> 00:29:20.960] I've taken the opportunity to convert PRD writing into prototype writing.
[00:29:20.960 --> 00:29:28.400] I've taken the opportunity to write more API interfaces and actually call those API interfaces with Postman myself.
[00:29:28.400 --> 00:29:39.280] I've taken the opportunity to subscribe to all the AI tools, right, and take that as a cost of learning so that I can understand what each model is useful for.
[00:29:39.280 --> 00:29:44.560] I've taken the opportunity to understand MCPs and connect different things to it.
[00:29:44.560 --> 00:29:54.080] Like right now, one of my pet projects, or one of the things I do in order to learn, is I pick a project that it's almost like writing an eval, what's an eval for my life to learn.
[00:29:54.240 --> 00:30:01.720] I'll take a pick a project that touches a lot of the things that I need to learn that I'm interested and passionate about.
[00:29:59.120 --> 00:30:05.560] And as I get into it, because of my passion, I start to learn more and more.
[00:30:05.880 --> 00:30:10.760] And so, one of the passionate purposes I've had I have is to automate my house.
[00:30:10.760 --> 00:30:23.240] So, I've been doing this for years and years, but now I'm trying to build a very smart house run on a processor that has AI chip and inference in it while the house looks dumb.
[00:30:23.240 --> 00:30:25.400] So, because I don't want anyone to know about it.
[00:30:25.400 --> 00:30:26.520] And so, what am I doing?
[00:30:26.520 --> 00:30:29.160] I'm getting models.
[00:30:29.160 --> 00:30:33.960] I am learning about how to, what quantization of those models means.
[00:30:33.960 --> 00:30:36.600] I am learning how to fine-tune some of them.
[00:30:36.600 --> 00:30:37.320] I'm feeding them.
[00:30:37.320 --> 00:30:41.720] I'm building an MCP server in my house that talks to the home entities.
[00:30:41.720 --> 00:30:47.720] And then I can collaborate with Claude on how to build and make it sort of super dynamic.
[00:30:48.040 --> 00:31:00.920] But beyond that, you know, one of the things we're spending time on is Advent Studio, I'm raising capital for Adventure Studio because that's what I want to spend the next decade on building seminar AI companies beyond the book.
[00:31:00.920 --> 00:31:11.240] I'm collaborating with some of the smartest people in AI right now who can teach me how to do this, how to write agents, what agents really mean.
[00:31:11.560 --> 00:31:21.640] And so, everything, as I said, writing evals, being humble, going back after 25 years of being a PM to being teachable is how I stay sharp.
[00:31:21.640 --> 00:31:23.400] I want to go back to this house.
[00:31:24.200 --> 00:31:25.720] What is your house doing?
[00:31:26.120 --> 00:31:27.880] How smart is this house?
[00:31:28.440 --> 00:31:41.880] The idea is that I give the house eyes and ears, and it senses the coolest thing is I've specced out what I call like a super sensor.
[00:31:41.880 --> 00:31:51.920] It will see people, it will feel their heat, it will hear them, it will sense humidity and temperature, and it will just be scattered around the house, right?
[00:31:52.080 --> 00:31:54.960] And I'm building the hardware myself.
[00:31:54.960 --> 00:31:59.440] And basically, as you walk into the house, it starts to adapt itself to you, right?
[00:31:59.440 --> 00:32:01.840] It manages energy in the house.
[00:32:01.840 --> 00:32:03.200] It knows you're in a room.
[00:32:03.200 --> 00:32:06.560] If you leave, my kids are not in their rooms right now.
[00:32:06.560 --> 00:32:11.520] Well, the HVAC around them just stopped and said, oh, they're not here.
[00:32:11.760 --> 00:32:16.720] When they come back, before they come back, an hour before they come back, it starts to pre-cool their space.
[00:32:16.720 --> 00:32:19.040] And then when they come back, it says, oh, they're here.
[00:32:19.040 --> 00:32:20.560] And it starts to do cool things.
[00:32:20.560 --> 00:32:23.520] So there's a lot of stuff like that, that a building.
[00:32:23.520 --> 00:32:26.160] And the question is, you don't want that to be completely inflexible.
[00:32:26.160 --> 00:32:28.320] You want it to be super dynamic and super human.
[00:32:28.320 --> 00:32:30.160] And so that's part of the AI.
[00:32:30.400 --> 00:32:32.560] I am the recipient of all of this.
[00:32:35.440 --> 00:32:36.720] I woke up at 1 a.m.
[00:32:36.800 --> 00:32:37.840] and I want toast.
[00:32:37.840 --> 00:32:39.360] Why is the toast not working?
[00:32:39.680 --> 00:32:42.240] That's because you shut off the power.
[00:32:42.480 --> 00:32:43.040] That's not good for you.
[00:32:45.440 --> 00:32:50.800] No, but look at, look, Isime walks into lots of rooms in the house and the light just turns on.
[00:32:50.800 --> 00:32:54.880] And when she leaves, she just walks out and then it knows she's not there and it turns it off.
[00:32:54.880 --> 00:32:55.920] And she's taking it for granted.
[00:32:56.240 --> 00:32:57.680] Well, it's actually doing all this.
[00:32:57.680 --> 00:32:58.320] Wow.
[00:32:58.560 --> 00:32:58.800] It is.
[00:33:00.400 --> 00:33:03.360] But there's also like a horror movie version of this.
[00:33:03.360 --> 00:33:04.800] Yes, there is.
[00:33:07.040 --> 00:33:13.200] Well, what I will say is that my family should not piss me off because I could do crazy things to them.
[00:33:14.800 --> 00:33:15.840] Oh, man.
[00:33:16.160 --> 00:33:16.800] Okay.
[00:33:17.760 --> 00:33:21.840] And this is built on like an LLM that you trained.
[00:33:22.320 --> 00:33:23.600] Is that what the powers is?
[00:33:23.920 --> 00:33:28.560] So multiple levels, there's this open source software called Home Assistant that I use.
[00:33:28.560 --> 00:33:31.400] And you can, it's very extensible.
[00:33:29.120 --> 00:33:34.840] So I can put in like fine-tuned LLMs.
[00:33:34.920 --> 00:33:38.120] I can use the big LLMs.
[00:33:39.080 --> 00:33:44.760] They're things like smaller spoken whisper models you can use to, like I can just say, hey, Jarvis.
[00:33:44.760 --> 00:33:50.120] like Iron Man, and the house will start talking to me and telling me things about what's going on.
[00:33:50.120 --> 00:33:51.480] So things like that.
[00:33:52.200 --> 00:33:53.800] Getting us back on track.
[00:33:54.680 --> 00:33:56.520] You guys are geeking out right now.
[00:33:58.120 --> 00:34:01.160] I'm excited and scared to visit this house.
[00:34:01.800 --> 00:34:03.400] That's good, my dad.
[00:34:04.680 --> 00:34:05.720] Get this out of me.
[00:34:05.960 --> 00:34:23.000] What I love about this is this is something that I've seen a bunch is having a very specific problem to solve for yourself is a really good way to build and get into these tools because you're motivated to solve a problem and it's like a specific thing that you're doing versus just generally, I'm going to go sit around and play with Lovable.
[00:34:23.000 --> 00:34:25.720] No, I think I want to double click on that, Lenny.
[00:34:25.720 --> 00:34:35.880] I was telling you the other time that we spoke that I get a lot of questions from, I would say, people who aren't very familiar with AI and they don't know where to start.
[00:34:35.880 --> 00:34:37.560] And that is exactly what I say.
[00:34:37.560 --> 00:34:40.680] It's like, hey, I know you love fashion, for example, right?
[00:34:40.680 --> 00:34:44.040] Or you like reading, right?
[00:34:44.360 --> 00:34:50.680] And you have the, you actually have the database of the entire Austin Library or whatever it is online.
[00:34:50.680 --> 00:34:54.760] And you can pull it if you really want to, and it can recommend based on your feelings.
[00:34:54.760 --> 00:34:57.960] So you can create a project around what it is that you're passionate about.
[00:34:57.960 --> 00:35:02.040] One particular woman, older lady, wanted to figure this out.
[00:35:03.080 --> 00:35:08.280] Figure out a passion project, and I knew she liked this idea called Outfit of the Day.
[00:35:08.280 --> 00:35:26.320] So with LLMs, she started a project where she actually had, she took the time to take all her tops, all her bottoms, all her dresses, all her shoes, and now she's working on, based on the temperature, having recommendations of her outfit for the day.
[00:35:26.320 --> 00:35:38.320] So that's one way she was able to find something that she loved and could take a, you know, leverage AI to build that could help her determine, okay, what should I wear or what are the options I should consider for today?
[00:35:38.320 --> 00:35:39.920] That's an amazing example.
[00:35:39.920 --> 00:35:45.040] Can I just make like one, you know, I'm always like a high concept prediction guy.
[00:35:45.040 --> 00:35:47.600] So I'm going to make one prediction.
[00:35:47.600 --> 00:35:58.640] I think that, you know, in the next few years, a lot of high agency people, not just PMs, will essentially be writing the software that automates their lives.
[00:35:59.120 --> 00:36:09.760] And so the one-man SaaS is going to go away because the ability to, it'll be hard to begin on developers because we will, people who really care will automate their lives.
[00:36:09.760 --> 00:36:12.400] They'll build applications that just work for them.
[00:36:12.400 --> 00:36:17.600] And the advice for PMs is that we should be the vanguard with that.
[00:36:17.600 --> 00:36:24.400] Because if we can start doing that for ourselves, we'll be able to do that for software that hits a million people because that's the way to get started.
[00:36:24.400 --> 00:36:26.160] And we also want to offer hope.
[00:36:26.160 --> 00:36:38.160] It's still day one, meaning that no matter how far, if you hear me and hear about fine-cheating and home assistance and all this crazy hardware that I'm trying to build, and you're scared, you're like, I'm behind.
[00:36:39.120 --> 00:36:40.240] You're not behind.
[00:36:40.240 --> 00:36:41.040] You're not behind.
[00:36:41.040 --> 00:36:48.320] People who seem like they're succeeding, we were here when the internet dropped and everyone, you know, Microsoft thought it was behind.
[00:36:48.320 --> 00:36:51.600] And, you know, with social, people, some people thought they were behind.
[00:36:51.600 --> 00:36:57.280] The people who seem like they're off the off-the-gate fastest, whether it's a company or person, not the people who win.
[00:36:57.280 --> 00:37:01.560] And so it's the wisest and the most dedicated to win.
[00:36:59.840 --> 00:37:03.480] And so people should just start.
[00:37:03.720 --> 00:37:08.280] You know, evil people, we have the proverb that says, whatever you wake up is your morning.
[00:37:08.280 --> 00:37:10.920] And so just wake up and go.
[00:37:11.160 --> 00:37:14.040] I love so much of the advice you two are sharing.
[00:37:14.920 --> 00:37:21.080] There's a post I'm going to link to in the show notes that shares a bunch of examples of what people are by coding building with AI.
[00:37:21.080 --> 00:37:22.920] That seems to be the biggest challenge for a lot of people.
[00:37:22.920 --> 00:37:24.680] It's just like, what should I actually do?
[00:37:24.680 --> 00:37:32.200] And something that I realized as I was putting that together is people think about like, oh, can you build real products and businesses on bold, lovable replic?
[00:37:32.440 --> 00:37:36.600] But interestingly, most people just want to build a thing that solves a specific problem for themselves.
[00:37:36.600 --> 00:37:41.960] And sometimes that turns into a thing that there's almost a bigger opportunity just to build all this personal software.
[00:37:42.440 --> 00:37:47.960] Similar to the example you shared, Ezine, there's a site, someone built this, like, how many layers should I wear today?
[00:37:48.200 --> 00:37:49.160] com or whatever.
[00:37:49.160 --> 00:37:52.040] I don't think that's the actual domain, and it's just like three.
[00:37:52.360 --> 00:37:54.520] And that's sometimes all you need to know.
[00:37:54.520 --> 00:37:57.080] Today's episode is brought to you by Coda.
[00:37:57.080 --> 00:38:02.440] I personally use Coda every single day to manage my podcast and also to manage my community.
[00:38:02.440 --> 00:38:06.520] It's where I put the questions that I plan to ask every guest that's coming on the podcast.
[00:38:06.520 --> 00:38:08.440] It's where I put my community resources.
[00:38:08.440 --> 00:38:10.120] It's how I manage my workflows.
[00:38:10.120 --> 00:38:11.960] Here's how Coda can help you.
[00:38:11.960 --> 00:38:15.240] Imagine starting a project at work and your vision is clear.
[00:38:15.240 --> 00:38:19.640] You know exactly who's doing what and where to find the data that you need to do your part.
[00:38:19.640 --> 00:38:29.480] In fact, you don't have to waste time searching for anything because everything your team needs from project trackers and OKRs to documents and spreadsheets lives in one tab all in Coda.
[00:38:29.480 --> 00:38:41.720] With Coda's collaborative all-in-one workspace, you get the flexibility of docs, the structure of spreadsheets, the power of applications, and the intelligence of AI, all in one easy-to-organize tab.
[00:38:41.720 --> 00:38:49.200] Like I mentioned earlier, I use Coda every single day, and more than 50,000 teams trust Coda to keep them more aligned and focused.
[00:38:49.200 --> 00:38:55.920] If you're a startup team looking to increase alignment and agility, Coda can help you move from planning to execution in record time.
[00:38:55.920 --> 00:39:02.480] To try it for yourself, go to coda.io/slash Lenny today and get six months free of the team plan for startups.
[00:39:02.480 --> 00:39:08.640] That's coda.io/slash lenny to get started for free and get six months of the team plan.
[00:39:08.640 --> 00:39:10.960] Coda.io/slash Lenny.
[00:39:11.280 --> 00:39:13.440] So, let me take this opportunity to zoom out a little bit.
[00:39:13.440 --> 00:39:18.720] We've been talking a lot about personal skills, what you think people need to get better at, what it takes to be successful.
[00:39:18.720 --> 00:39:20.560] I want to talk about at the company level.
[00:39:20.560 --> 00:39:24.560] You two work with a bunch of different companies, you've both worked at a bunch of different companies.
[00:39:24.560 --> 00:39:47.040] When you look at the companies that seem to be doing well in this new AI world, adopting the best practices, seeing productivity gains, versus the companies that are just struggling and not getting anywhere and just having a hard time, what do you find is most common across the one, the companies that are succeeding, shifting their culture and the way they work in this world of AI?
[00:39:47.040 --> 00:39:49.280] And I'll throw this to Esina first.
[00:39:49.600 --> 00:40:08.800] I think the main thing I will say is recognizing companies that recognize that AI is a core set of multiple capabilities that they've been gifted.
[00:40:08.800 --> 00:40:18.560] I think that you need to understand that it is not this magic thing that you're going to slather on to your product and it will do perfectly.
[00:40:18.560 --> 00:40:20.800] The problems are still the problems.
[00:40:20.800 --> 00:40:24.240] The customer is still the customer.
[00:40:24.240 --> 00:40:29.200] And their pain or sharp problems still exist as they have.
[00:40:29.560 --> 00:40:42.040] The idea now is: how do you take AI and not have it be something we call it, you put at the edge alone, but you actually re-transform the way you solve the customer's problem.
[00:40:42.040 --> 00:40:45.880] So we have this phrase called AI at the core and AI at the edge.
[00:40:45.880 --> 00:40:53.240] And the best way to think about it is that with AI at the edge, you're probably still using software as you always have.
[00:40:53.240 --> 00:40:56.280] The bits and bytes that you wrote before still are there.
[00:40:56.280 --> 00:41:04.360] You're instead inserting LLMs or AI at different intersections or connection points.
[00:41:04.360 --> 00:41:05.000] Okay?
[00:41:05.640 --> 00:41:28.840] AI at the core means fundamentally looking at the problem space and the workflows and using AI to solve the problem, not just sprinkling it at the intersections of GUIs or user interfaces that are out there or using it to just accelerate things.
[00:41:28.840 --> 00:41:44.920] So the way we talk about it is that companies where their code base remains and they are attaching AI to it often aren't the ones who are going to revolutionize their industry.
[00:41:45.240 --> 00:41:57.480] Companies for whom their code base perhaps shrinks and the LLM becomes a core part of what it is that they use to solve the problem are companies that actually are doing it right.
[00:41:57.800 --> 00:42:03.880] Another characteristic is specificity.
[00:42:04.520 --> 00:42:15.000] We've seen a lot of companies want to do so many things and expect LLMs to be as broad as the problem sets they want to solve for.
[00:42:15.920 --> 00:42:33.840] And we, in our experience, have seen that to do that, you actually will need to have specialized first and then create a layer of a connective tissue that can offer intelligence.
[00:42:33.840 --> 00:42:41.360] So we've seen companies instead just stay on top, try to build this massive solution set or LLM that can do it all.
[00:42:41.360 --> 00:42:54.960] But the successful ones that we've consulted with, partnered with, are talking to are those who instead have seen, you know what, I can find, I can build, with this team that I have on here, can actually build this very specific solution set.
[00:42:54.960 --> 00:42:56.240] This other team can do this.
[00:42:56.240 --> 00:43:07.440] And then my job is to tie it all together with a broader multi-model solution and build it and create an offering for the customer.
[00:43:07.440 --> 00:43:08.640] I think those are two core things.
[00:43:08.640 --> 00:43:11.040] I'm sure Oji can remember a few others.
[00:43:11.760 --> 00:43:17.600] No, I think, no, I love, I was humble in listening to you.
[00:43:19.200 --> 00:43:25.360] No, so maybe I'll embellish the point.
[00:43:25.680 --> 00:43:44.400] We're seeing companies that are starting from a blank slate, taking an old problem, a sharp problem, and saying, if we built it today with LLM as the core capability, as a core sort of blob of code, how would it look?
[00:43:44.400 --> 00:43:51.920] And often these people are finding a lot of acceleration and they're actually like taking on adjacencies.
[00:43:51.920 --> 00:44:10.920] So in a formless type form, you know, a new product built by David Okunev and me and a small group of people, we built a new type form and we ended up building more than it because it also became like a sales lead agent.
[00:44:10.920 --> 00:44:13.400] So like it did pre-sales.
[00:44:13.400 --> 00:44:19.560] which ordinarily Typeform gets you into qualification before you talk to a salesperson, but the product itself did pre-sale.
[00:44:19.560 --> 00:44:22.120] So that capability was so strong.
[00:44:22.440 --> 00:44:24.920] We're also seeing this shrinking thing.
[00:44:25.560 --> 00:44:33.320] It's funny, the last week I've seen companies that have been around for five years who, once LLMs came, they just reimagined their product.
[00:44:33.320 --> 00:44:37.880] Like Clay.ai is a six-year-old company, super successful.
[00:44:38.120 --> 00:44:45.720] So in the first category is like things like lovable, but in the second category of people who are like, okay, we're going to rethink this and we're going to do it.
[00:44:46.040 --> 00:44:52.760] Now, the problem for the rest of the industry is this, you know, the innovators dilemma.
[00:44:52.760 --> 00:44:55.800] They have lots of revenue tied to old code bases.
[00:44:55.800 --> 00:45:06.920] And so their job is to, and you know, we tell PM leaders all the time, is to navigate how you use LLMs on the old thing, but navigate to the new thing before someone eats your lunch.
[00:45:06.920 --> 00:45:09.800] And it's not, it's complicated, it's hard, but you have to do it.
[00:45:09.800 --> 00:45:12.840] Now, let me make maybe a couple of smaller things.
[00:45:13.800 --> 00:45:17.480] The people who are succeeding are taking chances on user experiences.
[00:45:17.480 --> 00:45:23.640] We don't believe the chat interface is the final boss for AI user experience.
[00:45:23.640 --> 00:45:31.400] We think that there's a reason UIs exist and the command line didn't work, otherwise, DOS will be the biggest operating system.
[00:45:31.640 --> 00:45:36.040] And so people have to do different things and be dynamic.
[00:45:36.080 --> 00:45:40.520] You know, one of the big things is dynamic user experiences that just personalize to the customer.
[00:45:40.680 --> 00:45:44.000] So we see a lot of people succeeding there.
[00:45:43.640 --> 00:45:51.120] And one of the things that we don't see people doing actually that we want people to do is something as mentioned, which is ethics.
[00:45:53.040 --> 00:46:05.120] We think of AI and its core capabilities as ordnance level, meaning like at some point we'll think of this as more powerful than the fusion, the fusion bomb.
[00:46:05.440 --> 00:46:13.360] And we have a bunch of software people who have no consideration of ethics play with this stuff all the time.
[00:46:13.360 --> 00:46:18.160] And so one thing we'd like to see the best companies do is lead on that.
[00:46:18.160 --> 00:46:24.800] Like think about the responsibility we have to the human race when we build new digital products or we give them superpowers.
[00:46:24.800 --> 00:46:27.680] I want to zoom out even further.
[00:46:28.000 --> 00:46:33.440] So we started talking about individual people, skills, what's going to be most important.
[00:46:33.440 --> 00:46:40.880] We've been talking just now about what companies are doing well, the ones that are succeeding in this crazy world of AI.
[00:46:40.880 --> 00:46:43.600] I want to now zoom out just your entire product career.
[00:46:43.600 --> 00:46:58.720] I'm curious just when you look back at the 50 plus years of combining experience you two have, what are some of the biggest product lessons that you've learned that you would love to share that you think people can avoid having to learn the hard way?
[00:46:58.720 --> 00:47:03.040] I think what I want to reiterate the thing that we talked about, sharp problems.
[00:47:04.000 --> 00:47:07.600] The problem you focus on is probably the most predictive of your success.
[00:47:07.600 --> 00:47:08.800] Pick a sharp problem.
[00:47:08.800 --> 00:47:13.520] Pick something that if you really solve it, people will come.
[00:47:14.400 --> 00:47:18.160] Watching Jeff Bezos from the 80s feels prescient.
[00:47:18.160 --> 00:47:20.400] He's like, look, people always shop.
[00:47:20.400 --> 00:47:21.840] I'm going to be the guy.
[00:47:22.640 --> 00:47:23.760] And he did it.
[00:47:23.760 --> 00:47:25.600] So do that.
[00:47:25.920 --> 00:47:35.640] And I say this humbly because I've failed, because sometimes we build things that we are passionate about, but being a founder is not an exercise.
[00:47:35.640 --> 00:47:40.280] And even an operator in a company, it's not an exercise in your passion and your brilliance.
[00:47:40.280 --> 00:47:42.920] It's an exercise in finding what the market needs.
[00:47:42.920 --> 00:47:43.640] So do that.
[00:47:43.640 --> 00:47:45.400] Be disciplined and doing that.
[00:47:45.400 --> 00:47:47.720] We also talk about simplicity a lot.
[00:47:48.040 --> 00:47:53.400] Even this morning, I was talking to a designer and he had a convoluted user experience.
[00:47:53.400 --> 00:47:58.040] And I said, only 20% of people who come to this will see that.
[00:47:58.040 --> 00:48:04.840] And so the point of design is first simplicity and clarity before we get fancy.
[00:48:04.840 --> 00:48:06.600] And so there's a lot.
[00:48:06.600 --> 00:48:09.400] Steve Jobs embodied this fairly on.
[00:48:09.400 --> 00:48:11.960] I think iOS has become complicated.
[00:48:11.960 --> 00:48:14.760] But like, start with simplicity.
[00:48:14.760 --> 00:48:16.840] And simple is hard.
[00:48:18.040 --> 00:48:21.560] And in 2025, we're designing for distracted brains.
[00:48:21.880 --> 00:48:23.000] A couple of things are happening.
[00:48:23.000 --> 00:48:24.600] People are super distracted.
[00:48:24.600 --> 00:48:26.440] People are no longer wild by technology.
[00:48:26.440 --> 00:48:28.200] You call it AI all you want.
[00:48:28.200 --> 00:48:30.760] They just assume that it is what it is.
[00:48:30.760 --> 00:48:35.960] And so simplicity is really good for 2025 and beyond, basically.
[00:48:37.480 --> 00:48:40.360] And then, you know, we talk a lot about fundamentals.
[00:48:40.360 --> 00:48:41.800] I'm going to throw this to you, is in there.
[00:48:41.800 --> 00:48:45.160] Like, there are ways to figure out the fundamentals.
[00:48:45.160 --> 00:48:51.160] What we do, I know we operate in the software layer, but it's most basic.
[00:48:51.160 --> 00:48:56.280] We're trying to predict what humans would do, what they will want, how their workflow will go.
[00:48:56.280 --> 00:49:09.080] And there's some, you know, there's a layer of that that is quite similar, you know, because our friends, the anthropologists and the psychologists, really understand this.
[00:49:09.080 --> 00:49:11.480] And I think if we understand it, we can build better software.
[00:49:11.480 --> 00:49:17.200] So, but isn't Izan, I want to let you, I'm just going to throw this up to you so I don't take up all the space.
[00:49:14.680 --> 00:49:17.680] Go for it.
[00:49:17.920 --> 00:49:38.480] No, I think you're on to something about just so when I think about the hard lessons I've learned, you talked about simplicity, and I'm saying this also from humility and just having learned a lot from past failures.
[00:49:38.800 --> 00:49:49.040] One of the reasons people create complicated solutions is because they are afraid to take a stab, to put their point of view.
[00:49:49.200 --> 00:49:50.720] Opinion out there.
[00:49:50.800 --> 00:49:51.280] Not opinion.
[00:49:52.000 --> 00:49:54.000] They're not opinionated enough.
[00:49:54.000 --> 00:50:01.280] And that often comes from not having high conviction because you're not sure that you've spent the time with the customer.
[00:50:01.280 --> 00:50:10.480] So, with that, I know it sounds trite, but spend time and understand the customer or borrow from others who do.
[00:50:10.480 --> 00:50:19.440] But you've got to not just lay it all out and have them choose which way because it is so effing confusing.
[00:50:19.440 --> 00:50:22.160] 10,000 options because you can't pick one.
[00:50:22.160 --> 00:50:26.080] Yeah, the experience is so murky, chalky.
[00:50:26.080 --> 00:50:30.640] Nobody knows what the hell to do because you can't make a decision.
[00:50:30.640 --> 00:50:31.920] Make it simple.
[00:50:32.320 --> 00:50:35.440] Come up with an opinion and put your opinion out there.
[00:50:35.440 --> 00:50:36.880] You can change.
[00:50:36.880 --> 00:50:39.840] Leave the configurations and the options behind.
[00:50:39.840 --> 00:50:41.760] But leave them behind the scenes if you must.
[00:50:41.760 --> 00:50:48.400] But try and create the most compelling, the simplest experience, and you will actually have better adoption.
[00:50:48.400 --> 00:50:51.200] You're better off doing that and being wrong.
[00:50:51.200 --> 00:51:03.080] Many times we leave and ship convoluted experiences because we don't have the courage, number one, to make a decision, create a point of view, and ship it out there.
[00:51:03.400 --> 00:51:14.120] The reality is, and my experience has shown that you're better off picking an opinion, shipping it out there, being wrong, and then adjusting.
[00:51:14.120 --> 00:51:20.920] that leaving too many options because you then can never learn what was the better experience overall.
[00:51:20.920 --> 00:51:23.160] So that's one thing.
[00:51:23.160 --> 00:51:29.720] The other major lesson I've learned is actually one about strategy.
[00:51:29.720 --> 00:51:41.240] It is that the best way to go from strategy to execution in a way that activates the entire organization is communication.
[00:51:41.480 --> 00:51:53.640] You will never, never spend too much time communicating the why to your organization over and over and over again.
[00:51:53.880 --> 00:52:02.920] I don't remember what movie it was, but I think it was the movie about sending someone to the moon, right?
[00:52:05.480 --> 00:52:06.760] I'm forgetting what it is.
[00:52:06.760 --> 00:52:12.840] But the idea that everybody in NASA knew that we were working to send a man to the moon.
[00:52:12.840 --> 00:52:15.480] Even the janitor did, knew that.
[00:52:15.480 --> 00:52:21.960] So that idea of ensuring that everybody in the team understands the why behind what we're doing and can speak to it.
[00:52:21.960 --> 00:52:24.040] I saw something on LinkedIn the other day.
[00:52:24.040 --> 00:52:28.280] I think it was Martin Erickson who actually pushed up, posted this.
[00:52:28.280 --> 00:52:42.520] He talked about how the same way we talk about crossing the chasm and figuring out who it is that will adopt a product, we should think about the change management required for strategy communication through that lens.
[00:52:42.520 --> 00:52:44.800] There will be early adopters.
[00:52:44.360 --> 00:52:45.920] Think about those people.
[00:52:46.240 --> 00:52:57.360] Those people will hear the message, the strategy, and by the time others are just beginning to understand what it is, they're probably on the next version of the strategy, asking what can change, what should change.
[00:52:57.360 --> 00:53:04.000] And that was so clarifying for me because I always wondered why strategy could get muddy sometimes.
[00:53:04.000 --> 00:53:09.040] It's because when you look at your entire population, about 5% are going to be the early adopters.
[00:53:09.040 --> 00:53:12.960] They get it, they buy into it, and they're moving.
[00:53:12.960 --> 00:53:17.440] Four months later, they're onto something wondering what the next version of the strategy will look like.
[00:53:17.440 --> 00:53:19.200] How do we build on top of this?
[00:53:19.200 --> 00:53:23.280] At that same time, some people are just beginning to get it.
[00:53:23.600 --> 00:53:36.480] And honestly, that answered a lot of questions for me as to how it is that you can be working on a strategy, some people get it, can recite it, tell you why we're doing what we're doing, while some PMs are still struggling with, well, why?
[00:53:36.800 --> 00:53:37.600] How come?
[00:53:37.600 --> 00:53:38.560] Can we do this?
[00:53:38.560 --> 00:53:41.440] Because they are, in fact, laggards out there.
[00:53:41.440 --> 00:53:56.000] So, that idea of trying to really activate a strategy into pure execution, if you think about communication as a critical element, and then also look at it as how many people, what is the crossing the chasm version?
[00:53:56.000 --> 00:53:59.360] Who is at what point in this change management journey?
[00:53:59.360 --> 00:54:05.440] I think that is actually, that was one of the major lessons I've learned as a leader in product.
[00:54:05.440 --> 00:54:14.320] I love these sorts of conversations where you two spend 50 years learning and then you just tell us, here's all the answers, here's a bunch of stuff that'll save you a bunch of pain and suffering.
[00:54:14.320 --> 00:54:17.520] Let me try to summarize what you shared and tell me what I missed.
[00:54:17.520 --> 00:54:30.000] So, some of the biggest things you two have learned over the past 50 years, and it's not in the right order, but communicating the why, making sure that you can never spend too much time helping people understand why what you're working on is important.
[00:54:30.600 --> 00:54:37.960] And then there's just the change management component of not underestimating the work that it'll take to convince people to adopt something.
[00:54:37.960 --> 00:54:39.960] And then, this whole idea of crossing the chasm.
[00:54:39.960 --> 00:54:44.440] We had Jeffrey Moore on the podcast, so we'll point to that if you want to deeply understand what that's all about.
[00:54:44.440 --> 00:54:57.640] Then, this lesson of simplicity, just keep per just the value of keeping things simple, easier said than done, and having an opinion, having the courage almost to do something really simple, not give people all the options.
[00:54:57.640 --> 00:55:05.320] And then, this Aji, you meant you talked about the idea of most times when something fails, it's the wrong, you're going after the wrong problem.
[00:55:05.320 --> 00:55:10.280] It's not sharp enough, or it's just not solving people, something people actually care about.
[00:55:10.600 --> 00:55:16.840] And then, just talking to customers, always easy to say, most people don't actually do it enough.
[00:55:16.840 --> 00:55:19.160] I want to add one more thing, and I'll be brief.
[00:55:19.160 --> 00:55:23.320] This is more about people's careers, like PMs.
[00:55:23.320 --> 00:55:36.280] In our career, we have, you know, we started out as engineers basically, became mainline PMs, got an MBA, became, you know, worked harder, became executives, and so on and so forth.
[00:55:36.280 --> 00:55:40.600] So, some, you know, people look at my career as an escrow and they're like, oh, this is amazing.
[00:55:40.600 --> 00:55:44.920] But behind that is a lot of intention.
[00:55:44.920 --> 00:55:51.240] Like, we always held in our brains what do we want to do next?
[00:55:51.240 --> 00:55:54.040] Like, what is the next step?
[00:55:54.040 --> 00:55:57.000] What is the hunger that drives us?
[00:55:57.000 --> 00:56:03.000] And there's nothing more powerful than intention, than like imagination.
[00:56:03.000 --> 00:56:08.840] It's like, you know, a little bit of cheese in front of the mouse, like, it chases after.
[00:56:08.840 --> 00:56:16.880] Being able to visualize and chase the thing that you see about where you want to be is so powerful.
[00:56:14.840 --> 00:56:19.680] I talked to a guy who gave me a break.
[00:56:19.920 --> 00:56:27.120] So when I was, you know, the last two years of my time at Microsoft, I switched to become a marketing lead.
[00:56:28.080 --> 00:56:32.640] I was a little tired of PM at Microsoft specifically, and I was getting an MBA.
[00:56:32.960 --> 00:56:34.720] And I talked to Dave Mendelin.
[00:56:34.880 --> 00:56:42.240] He was vice president, you know, or senior director in the, I worked, you know, eventually for Satya.
[00:56:42.560 --> 00:56:46.320] And he gave me a break and became a product marketer for a couple of years.
[00:56:46.320 --> 00:56:56.880] But he came back and talked to me this year and he said, OG, everything you told me you wanted to do from afar, I saw that you have done.
[00:56:57.200 --> 00:56:59.520] And you told me about them 10 years ago.
[00:56:59.520 --> 00:57:01.200] And I didn't even know that.
[00:57:01.200 --> 00:57:03.040] And so it's a very powerful thing.
[00:57:03.040 --> 00:57:08.000] I think that this is not just about how to build better products, but how to manage your own career is super important.
[00:57:08.000 --> 00:57:12.320] I just went to a talk, storytelling event, and there was a very similar theme.
[00:57:12.320 --> 00:57:15.280] This guy just always wanted to be a stand-up comedian.
[00:57:15.280 --> 00:57:20.960] And he, the lesson he learned, is just you need to, the best motivator is desire.
[00:57:22.080 --> 00:57:26.960] Having that desire not fade, because that'll, it's kind of like going back to our idea of the project.
[00:57:26.960 --> 00:57:33.200] Having a project to work on because you need a problem solved is a really good way to just get to where you want to go.
[00:57:33.200 --> 00:57:33.680] Yeah.
[00:57:33.920 --> 00:57:36.240] Ezina, I think you had something else you wanted to add.
[00:57:36.240 --> 00:57:40.560] Yeah, you know, you said this thing about, you know, know the customer, know the customer.
[00:57:40.560 --> 00:57:47.120] And I think there's a lot of info, there's a lot of noise out there about customer interviews, et cetera, right?
[00:57:47.120 --> 00:57:52.720] But I do want to remind people that there's a difference between what the customer says, right?
[00:57:52.720 --> 00:57:55.920] Customer discovery, asking them things, right?
[00:57:56.240 --> 00:57:58.400] And AI is really good at that right now.
[00:57:58.400 --> 00:58:00.760] Like, you can help you get all the transcriptions.
[00:57:59.920 --> 00:58:03.720] But just remember that what they do is actually more important.
[00:58:04.520 --> 00:58:18.920] So earlier in my career, I had the opportunity to have a UX lab that I worked in, and ethnographic research, watching what people truly were doing, still is top-notch.
[00:58:18.920 --> 00:58:20.920] It's the best thing you can ever do.
[00:58:20.920 --> 00:58:23.640] So, what does that look like digitally?
[00:58:23.640 --> 00:58:36.200] It's still the core instrumentation, figuring out how people finish things, but also true customer insight comes from not using AI to read all the transcripts of your customer interviews.
[00:58:36.200 --> 00:58:37.960] That isn't what it is.
[00:58:37.960 --> 00:58:41.960] It's actually trying to figure out how to best walk in their shoes.
[00:58:41.960 --> 00:58:47.160] And whatever it takes, you've got to figure out how to do that because it's not what they say they do.
[00:58:47.160 --> 00:58:48.520] It's not what they say they want.
[00:58:48.920 --> 00:58:52.040] That's not the true intimacy you want with a customer.
[00:58:52.040 --> 00:58:59.320] You actually want to observe them as much as possible and understand the why behind the actions they're taking.
[00:58:59.320 --> 00:59:07.320] And I think that's a really important thing to point out right now because I think people are cheating with AI through interviews, et cetera, and thinking that they're going to figure it out.
[00:59:07.320 --> 00:59:19.640] But there's a different type of insight that comes from what I call true ethnographic research, studying, observing, understanding the driver behind their need, their problem or the need that they're trying to solve.
[00:59:20.120 --> 00:59:34.040] And I just want to make sure we say that because I think people think AI has come to save it all, but no, it's going to give you junk because people don't always say what they mean or they don't realize that they're lying about their reasons for doing a thing.
[00:59:34.040 --> 00:59:39.560] I think some of these lessons, especially the last one, is the kind of lesson that you can hear it and be like, yeah, yeah, I get it.
[00:59:39.560 --> 00:59:43.080] And then you have to actually learn it and find the times.
[00:59:43.080 --> 00:59:43.880] Oh, wow, okay.
[00:59:43.880 --> 00:59:46.320] Everyone told me they wanted this thing, and nobody actually used it.
[00:59:46.320 --> 00:59:48.800] Okay, I remember as Ine said that on a podcast.
[00:59:44.920 --> 00:59:50.160] Oh, I get it now.
[00:59:51.040 --> 00:59:54.960] And so I think, you know, there's only so much we can do to help people learn these things.
[00:59:54.960 --> 00:59:58.720] Sometimes you just have to fail to learn.
[00:59:59.360 --> 01:00:03.120] Okay, is there anything that we have not touched on?
[01:00:03.120 --> 01:00:09.120] Anything that you wanted to share before we get to our very exciting lightning round?
[01:00:09.120 --> 01:00:10.640] I think we need to talk about ethics.
[01:00:10.640 --> 01:00:26.640] And as people of color and parents, I think that having a call to action for our PMs to really be thoughtful about what it is they're building and what it is they're integrating into their product is really important to us.
[01:00:26.640 --> 01:00:36.480] We just want to call PMs that this awesome power we have to direct lots of developers comes with responsibility, the Spider-Man trope.
[01:00:36.480 --> 01:00:39.120] And, you know, we created social media.
[01:00:39.120 --> 01:00:40.160] I worked at Twitter.
[01:00:40.160 --> 01:00:43.440] And I always like, oh my God, like, how many mistakes do we make?
[01:00:43.440 --> 01:00:50.880] Like, I remember the person who invented DeepFates was interviewed one time and she said, and they're like, what are the implications of your stuff?
[01:00:50.880 --> 01:00:52.880] And she's like, I don't care.
[01:00:53.200 --> 01:00:55.120] And I was so shocked.
[01:00:55.120 --> 01:00:55.760] I was like, what?
[01:00:55.920 --> 01:00:57.440] Software engineer, you don't care?
[01:00:57.440 --> 01:00:59.280] She's like, oh, it's all evented countermeasure.
[01:00:59.280 --> 01:01:00.320] I'm like, oh, my God.
[01:01:00.640 --> 01:01:02.880] So, PMs, we shouldn't be like that.
[01:01:03.520 --> 01:01:04.560] We shouldn't be like that.
[01:01:04.720 --> 01:01:09.200] I guess the one thing I also wanted to say is like, one pain point we hear a lot is strategy.
[01:01:09.600 --> 01:01:15.840] PMs who are like senior PM, principal PM, worry about, oh my God, how can I do product strategy?
[01:01:16.160 --> 01:01:20.800] And in the book, we talk a lot about the sources of strategy.
[01:01:20.800 --> 01:01:23.840] And it's not mysterious.
[01:01:23.840 --> 01:01:34.280] I think there's things you need to learn about, like the sources of competitive advantage and even what that means, like intellectual property or like economies of scale, economies of scope.
[01:01:34.600 --> 01:01:39.640] Because there's no formal education about this stuff, PMs don't have the fundamentals they need.
[01:01:39.640 --> 01:01:54.600] Because it's easy to learn customer discovery, but like learning the seven levers of competitive advantage, maybe you've never
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Prompt 5: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 2 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
important to us.
[01:00:26.640 --> 01:00:36.480] We just want to call PMs that this awesome power we have to direct lots of developers comes with responsibility, the Spider-Man trope.
[01:00:36.480 --> 01:00:39.120] And, you know, we created social media.
[01:00:39.120 --> 01:00:40.160] I worked at Twitter.
[01:00:40.160 --> 01:00:43.440] And I always like, oh my God, like, how many mistakes do we make?
[01:00:43.440 --> 01:00:50.880] Like, I remember the person who invented DeepFates was interviewed one time and she said, and they're like, what are the implications of your stuff?
[01:00:50.880 --> 01:00:52.880] And she's like, I don't care.
[01:00:53.200 --> 01:00:55.120] And I was so shocked.
[01:00:55.120 --> 01:00:55.760] I was like, what?
[01:00:55.920 --> 01:00:57.440] Software engineer, you don't care?
[01:00:57.440 --> 01:00:59.280] She's like, oh, it's all evented countermeasure.
[01:00:59.280 --> 01:01:00.320] I'm like, oh, my God.
[01:01:00.640 --> 01:01:02.880] So, PMs, we shouldn't be like that.
[01:01:03.520 --> 01:01:04.560] We shouldn't be like that.
[01:01:04.720 --> 01:01:09.200] I guess the one thing I also wanted to say is like, one pain point we hear a lot is strategy.
[01:01:09.600 --> 01:01:15.840] PMs who are like senior PM, principal PM, worry about, oh my God, how can I do product strategy?
[01:01:16.160 --> 01:01:20.800] And in the book, we talk a lot about the sources of strategy.
[01:01:20.800 --> 01:01:23.840] And it's not mysterious.
[01:01:23.840 --> 01:01:34.280] I think there's things you need to learn about, like the sources of competitive advantage and even what that means, like intellectual property or like economies of scale, economies of scope.
[01:01:34.600 --> 01:01:39.640] Because there's no formal education about this stuff, PMs don't have the fundamentals they need.
[01:01:39.640 --> 01:01:54.600] Because it's easy to learn customer discovery, but like learning the seven levers of competitive advantage, maybe you've never seen, or learning the 10 ways that companies can grow, which is not just product, sometimes it's distribution, sometimes a superior business strategy.
[01:01:54.600 --> 01:02:12.600] And these are ideas that if PMs had good material, it's sort of like packed in the middle of business school, you can actually become a person who helps direct your company on where to go and do it in such a way that's so compelling you can talk about it that you know it's good for your career.
[01:02:12.600 --> 01:02:18.280] I remember Elaine, I think one time I saw maybe a long time ago, I saw something you're written around how do you make money?
[01:02:18.280 --> 01:02:32.520] You know, you sell a thing, you rent a thing, you know, there many people think that it's there are many, many, like it's this amorphous thing, but you may not have the truly exhaustive list, right?
[01:02:32.520 --> 01:02:43.400] But there are a fixed number of things that you can know to create strategy to make money and just doing the work or doing the research to figure out, okay, hey, how does one make money?
[01:02:43.400 --> 01:02:45.480] How does one get competitive advantage?
[01:02:45.480 --> 01:02:50.600] In the book, we wrote about 10 growth levers and seven levels of competitive advantage.
[01:02:50.600 --> 01:03:01.720] And I think if you master that, you have like 80% of your tool set that makes you a compelling director or CPO that can play at that level.
[01:03:02.120 --> 01:03:03.160] And it's super important.
[01:03:03.160 --> 01:03:04.360] Otherwise, you're sort of guessing.
[01:03:04.360 --> 01:03:06.920] You're like, oh, did this deck work for my CEO?
[01:03:06.920 --> 01:03:08.760] No, that's not what you should think about it.
[01:03:08.920 --> 01:03:21.840] I'll point to the post that you're mentioning, Ezine, that is all the ways to make money, but let's use this opportunity to talk about your book, just what people would get out of it, why they should buy it, do the pitch, and then we'll get to the very exciting lightning round.
[01:03:22.160 --> 01:03:31.040] In our career, 50 plus years, when we started this product management thing, we thought, oh, this is going to be mainstream.
[01:03:31.040 --> 01:03:31.840] There's development.
[01:03:31.840 --> 01:03:33.840] There are 30 million developers, whatever.
[01:03:33.840 --> 01:03:36.640] And there's going to be product managers all over the world.
[01:03:36.640 --> 01:03:48.720] But operating in the US, operating in Europe, in Africa, it turns out the best PMs are so ghettoed somewhere in the west coast of the United States, right?
[01:03:48.720 --> 01:03:49.920] We know this.
[01:03:49.920 --> 01:03:51.840] And we don't like that at all, right?
[01:03:51.840 --> 01:03:53.360] We think opportunity should be everywhere.
[01:03:53.360 --> 01:04:06.320] So we wrote Building Rocket Ships to help people figure out how to build not only great products, but the companies that build great products to make sure that people have the knowledge not to be one-hit wonders.
[01:04:06.320 --> 01:04:10.560] And the book is divided into two: one is the fundamentals.
[01:04:10.560 --> 01:04:15.840] If you're a senior to principal PM, the fundamentals of building a great product, right?
[01:04:15.840 --> 01:04:18.720] All the way from simplicity to even pricing.
[01:04:19.040 --> 01:04:24.320] And then the second part is how do you lead a high-performing shipyard?
[01:04:24.320 --> 01:04:40.720] And how do you go from motivating people to charting the course for the future, depending on where you're in the market, to make, you know, like Ezena said, to commanding a shipyard that is kick-ass.
[01:04:41.200 --> 01:04:42.640] That's the book.
[01:04:42.640 --> 01:04:50.080] And, you know, in the latter part, we also talk about what's going to happen in terms of AI over the next few years.
[01:04:50.480 --> 01:04:53.200] So, we are, it's called Building Rocket Ships.
[01:04:53.200 --> 01:04:54.000] We're very proud of it.
[01:04:54.000 --> 01:04:55.520] It's right over here.
[01:04:55.520 --> 01:05:03.160] And we think it's awesome because it represents a lot of the experience that we've had over these years and a lot of the success.
[01:05:03.480 --> 01:05:05.560] And where do folks find it if they want to go check it out?
[01:05:05.560 --> 01:05:08.520] Maybe you can buy one or two or ten or hundred.
[01:05:08.520 --> 01:05:11.480] Some companies have been buying a lot actually in bulk.
[01:05:11.480 --> 01:05:13.240] So, this is a note for the companies.
[01:05:13.240 --> 01:05:15.320] But you can buy it on Amazon.
[01:05:15.560 --> 01:05:17.320] You can find it on Shopify.
[01:05:17.320 --> 01:05:22.680] On Shopify, if you go for it, there's an additional Pro Edition.
[01:05:22.680 --> 01:05:24.840] The Pro Edition is like a team product.
[01:05:25.000 --> 01:05:30.040] The book is on coda.io, and you can share it with your team.
[01:05:30.040 --> 01:05:31.000] You can take notes.
[01:05:31.000 --> 01:05:42.920] There's almost double the amount of content because it contains checklists, templates, tools for PMs that they can download and just use immediately.
[01:05:42.920 --> 01:05:46.840] It's almost like code for you, for your PM career.
[01:05:46.840 --> 01:05:50.680] We have the book, and then we have the Coda version.
[01:05:50.680 --> 01:05:54.200] And one of the things we really worked on was making it actionable.
[01:05:54.200 --> 01:06:07.160] So the minute you get the Coda version, you can read a chapter and then you'll have templates, frameworks, step checklists, whatever it is that we felt makes sense.
[01:06:07.160 --> 01:06:09.320] We made that available via the coda version.
[01:06:09.320 --> 01:06:11.080] What a motherload of value.
[01:06:11.880 --> 01:06:15.240] People always have issues saying the URL on podcasts.
[01:06:15.240 --> 01:06:16.200] I just want to make sure what's that?
[01:06:16.200 --> 01:06:17.400] How do you actually find this version?
[01:06:17.400 --> 01:06:20.600] You said it's on Shopify, but there's probably a domain name to get there.
[01:06:20.600 --> 01:06:33.800] So if you go to www.productmind.co slash brpro, you can buy the Pro Edition.
[01:06:34.120 --> 01:06:41.160] And if you've bought it, you can also use it to log into the coda edition, and you're off to the races.
[01:06:41.160 --> 01:06:41.800] There we go.
[01:06:41.800 --> 01:06:42.280] Amazing.
[01:06:42.280 --> 01:06:44.200] We'll also link to it in the show notes.
[01:06:44.200 --> 01:06:47.760] And with that, we've reached our very exciting lightning round.
[01:06:44.680 --> 01:06:49.440] I've got five questions for you, too.
[01:06:49.440 --> 01:06:50.320] Are you ready?
[01:06:50.640 --> 01:06:51.280] Yes.
[01:06:51.280 --> 01:06:55.760] First question is: what are two or three books you find yourself recommending most to other people?
[01:06:55.760 --> 01:06:56.800] We just did one.
[01:06:56.800 --> 01:06:57.680] Building Rocket Ships.
[01:06:57.760 --> 01:07:01.760] Building Rocket Ships is the first one, so we're not afraid to self-promote.
[01:07:01.760 --> 01:07:04.000] But also, the other one for me is Build.
[01:07:04.000 --> 01:07:05.200] Tony Fidel.
[01:07:05.200 --> 01:07:12.160] You know, I ran into Tony in Nigeria building an accelerator, and it was just incredible book.
[01:07:12.160 --> 01:07:13.840] So I totally recommend it.
[01:07:13.840 --> 01:07:17.520] One for me, and it's both for work and life, etc.
[01:07:17.840 --> 01:07:19.920] This one called The Let Them Theory.
[01:07:19.920 --> 01:07:21.360] I think it was Mel Robbins.
[01:07:21.520 --> 01:07:22.160] I love it.
[01:07:22.160 --> 01:07:34.080] I just think that many times PMs are control freaks or A type and being able to leave people to their own devices sometimes is hard.
[01:07:34.080 --> 01:07:40.640] So let them remind us that we get a lot more rest for ourselves when we let people be.
[01:07:40.960 --> 01:07:42.880] I was just listening to that book on audio.
[01:07:42.880 --> 01:07:46.480] I was like, it's sitting at the top of the bestseller like every freaking week.
[01:07:46.480 --> 01:07:48.000] What is this book?
[01:07:48.320 --> 01:07:48.960] And it's so good.
[01:07:48.960 --> 01:07:50.240] I sat on your background actually.
[01:07:50.240 --> 01:07:51.520] I noticed that you're reading it.
[01:07:51.760 --> 01:07:52.640] Yeah, it's so good.
[01:07:52.640 --> 01:07:53.680] It's such a simple idea.
[01:07:53.840 --> 01:07:56.640] And the thing I learned at listening to it is there's a second part to it.
[01:07:56.640 --> 01:07:57.680] It isn't just let them.
[01:07:57.680 --> 01:07:58.000] I don't know.
[01:07:58.160 --> 01:07:58.400] Let them.
[01:07:59.440 --> 01:08:00.080] Yeah, I did.
[01:08:00.080 --> 01:08:01.200] It's about let me.
[01:08:01.200 --> 01:08:01.760] Yeah, exactly.
[01:08:01.920 --> 01:08:02.240] Let me.
[01:08:02.480 --> 01:08:02.880] Such a guy.
[01:08:02.960 --> 01:08:03.600] I've been using it.
[01:08:03.600 --> 01:08:04.640] It's really powerful.
[01:08:04.640 --> 01:08:05.200] I get why.
[01:08:06.240 --> 01:08:08.800] I didn't, I mean, the book is good.
[01:08:08.800 --> 01:08:10.000] Yeah, let's give her credit.
[01:08:10.000 --> 01:08:14.000] But it's such a simple thing that other people in many religions, et cetera, have used.
[01:08:14.000 --> 01:08:25.520] But she grounds, she grounds, she shows you why it's necessary and how we show up trying to control others in other situations that are very common to most people.
[01:08:25.520 --> 01:08:26.800] So I like that.
[01:08:26.800 --> 01:08:28.400] I never thought of it in the PM context.
[01:08:28.400 --> 01:08:29.240] That is very cool.
[01:08:29.040 --> 01:08:31.000] And that's by Mel Robinson, by the way.
[01:08:31.320 --> 01:08:35.960] Okay, next question: Is there a favorite recent movie or TV show that you really enjoyed?
[01:08:35.960 --> 01:08:38.680] So we watched a few shows together recently.
[01:08:38.680 --> 01:08:45.080] So I would say one that we loved, and we couldn't, we kept saying, gosh, we can't wait for our kids to be old enough to watch.
[01:08:45.080 --> 01:08:47.560] It was one called Forever.
[01:08:48.600 --> 01:08:51.160] It was just really, really good.
[01:08:53.000 --> 01:08:54.200] It's a love story.
[01:08:54.200 --> 01:08:56.920] It's one of these books.
[01:08:57.000 --> 01:09:01.400] It was a book reimagined, done from an African-American point of view.
[01:09:01.400 --> 01:09:03.240] It was just really well done.
[01:09:03.240 --> 01:09:04.600] So that's a TV show.
[01:09:04.600 --> 01:09:06.280] It's called Forever.
[01:09:06.280 --> 01:09:09.720] And another one that we really like, because I think it was Sterling K.
[01:09:09.800 --> 01:09:11.560] Brown is Paradise.
[01:09:11.720 --> 01:09:12.840] Really enjoyed it.
[01:09:12.840 --> 01:09:14.120] It was interesting.
[01:09:14.920 --> 01:09:15.320] Excellent.
[01:09:15.640 --> 01:09:16.200] It kept us.
[01:09:16.520 --> 01:09:17.000] Yeah.
[01:09:17.000 --> 01:09:17.960] Adju, what about you?
[01:09:18.120 --> 01:09:18.920] Mine is on a roll.
[01:09:18.920 --> 01:09:23.240] And the movie that we've been geeking out on is Sinners.
[01:09:23.880 --> 01:09:25.240] We just love Sinners.
[01:09:25.240 --> 01:09:26.200] I think I saw it twice.
[01:09:26.200 --> 01:09:29.240] My son saw it like three or four times.
[01:09:30.040 --> 01:09:33.000] We're just big fans of Koogler and we tell stories.
[01:09:33.000 --> 01:09:34.360] So we love that.
[01:09:34.760 --> 01:09:40.120] I've seen both those two recently, and they both have very unexpected, they've been unexpected directions.
[01:09:40.120 --> 01:09:41.640] I'm like, why did I see this coming?
[01:09:41.960 --> 01:09:42.840] Yes.
[01:09:42.840 --> 01:09:43.640] Yes.
[01:09:43.640 --> 01:09:44.200] Oh, man.
[01:09:44.200 --> 01:09:45.800] Yeah, that movie is beautiful, Sinners.
[01:09:45.800 --> 01:09:46.600] I was like, wow.
[01:09:46.920 --> 01:09:48.600] But I usually hate scary movies.
[01:09:48.600 --> 01:09:49.240] I enjoyed that one.
[01:09:49.240 --> 01:09:52.520] Someone else recently mentioned that movie too, actually, on the podcast.
[01:09:52.520 --> 01:09:56.040] Next question: Is there a favorite product you recently discovered that you really love?
[01:09:56.040 --> 01:09:59.400] Could be a gadget, could be clothes, could be a kitchen device, could be an app.
[01:09:59.400 --> 01:09:59.800] I'll start.
[01:09:59.880 --> 01:10:01.240] I mean, I love Claude.
[01:10:01.240 --> 01:10:03.320] It's like Adi says, it's my sidekick.
[01:10:03.320 --> 01:10:06.360] I use it for anything and everything.
[01:10:07.640 --> 01:10:14.400] But obviously, the thing you learn is: don't, even though you keep talking about it as generative AI, be the generator and let it refine.
[01:10:14.400 --> 01:10:15.600] That's the major trick.
[01:10:15.840 --> 01:10:17.120] That's the hack everybody should learn.
[01:10:17.280 --> 01:10:17.920] Super hack.
[01:10:14.280 --> 01:10:18.560] Don't generate.
[01:10:19.120 --> 01:10:22.320] Generate first and let it refine and let it help you.
[01:10:22.320 --> 01:10:24.240] Otherwise, you're going to get bullshit.
[01:10:24.240 --> 01:10:24.560] Sorry.
[01:10:24.560 --> 01:10:25.760] That's a really good tip.
[01:10:26.400 --> 01:10:32.320] And one, I'll say I have this new Nespresso machine, the Virtuo, that Oji got from me.
[01:10:32.640 --> 01:10:37.120] Everybody is using it now, and I'm kind of pissed, but it's still my favorite product.
[01:10:37.120 --> 01:10:38.800] But everybody's in it.
[01:10:39.200 --> 01:10:42.960] The reason is, and now we like keeping it really clean and shiny.
[01:10:42.960 --> 01:10:46.400] And now that everybody's using it, it's not as clean and shiny as it used to be.
[01:10:46.560 --> 01:10:48.960] When she says everybody, she means everyone else in the house.
[01:10:48.960 --> 01:10:50.320] She wants to use it by herself.
[01:10:50.640 --> 01:10:51.120] Let's be clear.
[01:10:51.360 --> 01:10:53.520] She needs a second version just for herself.
[01:10:53.520 --> 01:10:55.360] Yeah, we need another one.
[01:10:55.360 --> 01:10:58.960] And then the plebs and the pluritorate can use the other ones.
[01:10:58.960 --> 01:10:59.280] That's right.
[01:10:59.280 --> 01:11:00.000] That's what it is.
[01:11:00.400 --> 01:11:00.880] When they're good.
[01:11:00.880 --> 01:11:01.360] When they're good.
[01:11:01.600 --> 01:11:04.000] Make all my lattes, hot, cold, all of that stuff.
[01:11:04.000 --> 01:11:04.480] I love it.
[01:11:04.480 --> 01:11:06.080] Aji, what about you?
[01:11:06.080 --> 01:11:09.280] I just like using these new AI tools that improve my life.
[01:11:09.440 --> 01:11:10.960] Gamma is good.
[01:11:10.960 --> 01:11:13.760] Framer is also good.
[01:11:14.000 --> 01:11:18.960] Get into Lovable because I sort of like say using the hardcore stuff.
[01:11:18.960 --> 01:11:30.320] And so using Lovable felt like a little step down it a little bit, but like Olama, LM Studio, these are ways that I used to do my hobby of improving the house.
[01:11:30.800 --> 01:11:35.360] But also, I'm super into local models.
[01:11:35.680 --> 01:11:40.800] These open source models, running them, because I feel like I can put them in different places.
[01:11:41.040 --> 01:11:44.000] So LM Studio and Olama are just favorite things.
[01:11:44.560 --> 01:11:47.200] So yeah, those are some of the things I'm vibing on.
[01:11:47.200 --> 01:11:50.960] Gamma and Lovable get a year free by becoming a paid subscriber of my newsletter.
[01:11:51.120 --> 01:11:52.000] We'll link to that.
[01:11:52.240 --> 01:11:53.040] Little did you know.
[01:11:53.680 --> 01:11:54.800] I heard about Lovable.
[01:11:54.800 --> 01:11:56.000] I didn't know about Gamma.
[01:11:56.480 --> 01:11:56.960] Absolutely.
[01:11:57.920 --> 01:11:59.280] Maybe Framer someday.
[01:11:59.280 --> 01:12:00.360] There's a lot going on there.
[01:12:00.360 --> 01:12:00.760] Check it out.
[01:12:00.760 --> 01:12:01.800] Lenny'snewsletter.com.
[01:12:01.800 --> 01:12:03.000] Click product pass.
[01:12:03.000 --> 01:12:03.400] Okay.
[01:11:59.920 --> 01:12:04.440] Two more questions.
[01:12:04.760 --> 01:12:11.160] Is there a favorite life motto that you two find really useful and come back to in work or in life?
[01:12:11.160 --> 01:12:12.840] So I'll give you, I'll say two.
[01:12:12.840 --> 01:12:14.760] And one has been from childhood.
[01:12:14.760 --> 01:12:20.200] And my mother had a little plaque in our house that said, anything worth doing is worth doing well.
[01:12:20.520 --> 01:12:21.560] And it's so funny.
[01:12:21.560 --> 01:12:27.720] I just, I assume that that's how everybody lives, but I realize now that it isn't.
[01:12:27.720 --> 01:12:30.520] So I come back to that all the time.
[01:12:30.840 --> 01:12:33.400] And then, so I'll say that came from my mother.
[01:12:33.400 --> 01:12:41.400] But the one I think I've adopted and probably just general upbringing is this thing that anywhere you are, make better.
[01:12:41.400 --> 01:12:46.840] So it's like by being there, I've made it better.
[01:12:46.840 --> 01:12:51.240] And it's simple things like you walk into a bathroom, it's nasty, try and clean up if you can't.
[01:12:51.240 --> 01:12:52.920] It's that simple.
[01:12:52.920 --> 01:12:53.480] Beautiful.
[01:12:55.240 --> 01:12:58.760] But it's something I live with and I live by.
[01:13:00.600 --> 01:13:02.440] Audi, what you got?
[01:13:03.080 --> 01:13:07.720] You know, I mentioned one already, which is whenever you wake up is your morning.
[01:13:07.720 --> 01:13:20.680] So like I tried, I try to spend very little time looking backwards, except to mind the lessons of it, and then just move forward because the universe and the forward motion is infinite.
[01:13:20.680 --> 01:13:24.360] And so it's an evil proverb: ogei teta wototogei.
[01:13:24.360 --> 01:13:25.960] So I want to say that.
[01:13:26.680 --> 01:13:38.760] But the other thing I think I learned from my dad is just like learn all the time, but doesn't mean that you should be not confident.
[01:13:38.920 --> 01:13:44.280] You know, it's this idea that there's more knowledge outside your brain than inside it.
[01:13:44.280 --> 01:13:46.160] And that doesn't mean you should be in a crouch.
[01:13:46.160 --> 01:13:53.680] In fact, you should be proud of being called stupid for a second because we're all stupid at something, right?
[01:13:53.680 --> 01:13:55.120] In the sense that we don't know.
[01:13:55.120 --> 01:13:57.680] There's so many things we don't know because we're finite.
[01:13:57.680 --> 01:14:04.480] But do that and still be confident so you can learn but still have a sense of self on what you've accomplished.
[01:14:04.480 --> 01:14:06.400] And so I try to live by that every day.
[01:14:07.040 --> 01:14:10.160] I love that you each have two and they're both so wonderful.
[01:14:11.360 --> 01:14:15.760] Unfortunately, final question: something that I like to do with duo guests.
[01:14:16.480 --> 01:14:23.840] I switch between two types of questions, so I'm going to ask you this one: What's something you love about the other person?
[01:14:24.160 --> 01:14:32.640] I love that Oji lives in the future, and as obviously, I get the benefit of this.
[01:14:33.280 --> 01:14:36.160] We get to plan.
[01:14:36.400 --> 01:14:51.360] He's very intentional about the future, and it just makes us move together with intention for as partners in life, as well as even with our company.
[01:14:51.360 --> 01:14:55.280] I think that is something I really appreciate about him.
[01:14:55.840 --> 01:14:57.360] So sweet.
[01:14:59.600 --> 01:15:02.080] So, there are two things I love about Izina.
[01:15:02.080 --> 01:15:07.200] One is she is, I think she's a very accepting, loving person.
[01:15:07.200 --> 01:15:07.760] She's nice.
[01:15:07.760 --> 01:15:14.720] Like, when I introduce her, I say, Look, you're gonna talk to her and you're gonna forget about me because she's 10 times nicer than I am.
[01:15:15.280 --> 01:15:18.720] I'm not a horrible person, but she's just way nicer than I am.
[01:15:18.720 --> 01:15:20.320] And so, that's that's great.
[01:15:20.320 --> 01:15:26.720] The second thing is something that people don't know about her is that she's a great problem solver, right?
[01:15:26.720 --> 01:15:34.120] Like, she will, you'll throw something at her and the asymmetric thinking is incredible, right?
[01:15:29.680 --> 01:15:36.520] I didn't think of that.
[01:15:36.840 --> 01:15:38.200] It happens to me all the time.
[01:15:38.200 --> 01:15:39.720] So I love that too.
[01:15:40.040 --> 01:15:43.000] Ezini and Aji, this was amazing.
[01:15:43.000 --> 01:15:48.840] We covered so much ground, so many lessons learned over these past 50 years about life and work and PMing and all the things.
[01:15:48.840 --> 01:15:49.960] Two final questions.
[01:15:49.960 --> 01:15:51.960] Where can folks find you online if they want to reach out?
[01:15:51.960 --> 01:15:54.360] And how can listeners be useful to you?
[01:15:54.360 --> 01:15:58.520] They can find us online at www.productmind.co.
[01:15:58.520 --> 01:15:59.480] That's first.
[01:15:59.480 --> 01:16:05.000] That's where we help founders and help companies build better products.
[01:16:05.160 --> 01:16:10.600] They can find us on Substack where we publish as ProductMind.
[01:16:12.120 --> 01:16:13.560] I think those are the main places.
[01:16:13.800 --> 01:16:21.960] We are starting to put content out in YouTube or, you know, just experimenting with how to amplify our voice.
[01:16:21.960 --> 01:16:24.120] But those are the two main places you can find us.
[01:16:24.120 --> 01:16:24.520] Awesome.
[01:16:24.520 --> 01:16:26.040] And we'll link to all that stuff in the show notes.
[01:16:26.040 --> 01:16:27.080] And then, yeah, Ezini.
[01:16:27.240 --> 01:16:28.200] No, same thing.
[01:16:28.200 --> 01:16:31.800] Like our LinkedIn, we always say my DMs are always open.
[01:16:31.800 --> 01:16:34.280] That's what I always tell people at work.
[01:16:34.920 --> 01:16:36.120] You can reach out to us.
[01:16:36.360 --> 01:16:39.000] We're pretty responsive to most people.
[01:16:39.480 --> 01:16:43.800] But yeah, productmind.co is the best way to get a hold of us.
[01:16:43.800 --> 01:16:49.320] How can we be useful to us is like we are so inspired by what you've done with your community.
[01:16:49.640 --> 01:17:01.560] And we're part of a few communities, but we are very interested in partnering with more PMs, especially executive cohort, like who are having a hard time right now.
[01:17:01.560 --> 01:17:13.880] Like partnering with us, calling us, like we want to be able to distribute opportunity that we may not be interested in to PMs who are struggling right now.
[01:17:13.880 --> 01:18:19.280] And sometimes it's not obvious where to pass opportunity that we're not interested in so like i think useful to us is reach out um and there might be intersections that help you or help us so do it amazing and that's through dms on linkedin ideally or through your website yes through our website or through substack or just linked okay all the places yeah incredible uh thank you two so much for being here such a pleasure thank you lenny um we just like what you're doing for our community is incredible so uh hats off i appreciate it hats off good to see the progress over over the years so yeah well done thank you thank you you too okay bye everyone thank you so much for listening if you found this valuable you can subscribe to the show on apple podcasts spotify or your favorite podcast app also please consider giving us a rating or leaving a review as that really helps other listeners find the podcast you can find all past episodes or learn more about the show at lennyspodcast.com see you in the next episode
Prompt 6: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 7: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Full Transcript
[00:00:00.160 --> 00:00:05.520] It really irked me when in 2024 most people were saying AI is here and now the PM job is gone.
[00:00:05.520 --> 00:00:10.720] There were others who were saying AI is here so now we can do the things we don't like to do with AI.
[00:00:10.720 --> 00:00:15.040] What do you find is most common across the companies that are succeeding?
[00:00:15.040 --> 00:00:21.280] Companies that recognize that AI is not this magic thing that you're gonna slather on to your product.
[00:00:21.280 --> 00:00:23.280] The problems are still the problem.
[00:00:23.280 --> 00:00:27.600] Something else that I think is also really important is getting super hands-on with the tech.
[00:00:27.600 --> 00:00:34.880] I've written more code in the last one year than I have in the last 10 years because code is now essentially architecture and English.
[00:00:34.880 --> 00:00:38.800] I'll pick a project that touches a lot of the things that I need to learn.
[00:00:38.800 --> 00:00:41.840] One of the passion projects I have is to automate my house.
[00:00:41.840 --> 00:00:44.880] The idea is that I give the house eyes and ears.
[00:00:44.880 --> 00:00:48.160] The coolest thing is I've specced out a super sensor.
[00:00:48.160 --> 00:00:50.400] It will see people, hear them.
[00:00:50.400 --> 00:00:55.680] It'll sense humidity and temperature and I'm building the hardware myself.
[00:00:56.960 --> 00:01:00.400] Today my guests are Aji and Ezine Udezwe.
[00:01:00.400 --> 00:01:05.920] They are married, both longtime product leaders with over 50 years of combined product experience.
[00:01:05.920 --> 00:01:16.400] They're also the authors of a new book that I love called Building Rocket Ships, Product Management for High Growth Companies, which is a synthesis of their biggest product lessons over the course of their entire career.
[00:01:16.400 --> 00:01:22.240] Aji was chief product officer at Calendly and Typeform and led product teams at Twitter, Atlassian, and Microsoft.
[00:01:22.240 --> 00:01:26.160] Ezine was CPO at WP Engine and VP of Product at ProCore.
[00:01:26.160 --> 00:01:29.840] We chat about what is changing in the role of product management.
[00:01:29.840 --> 00:01:31.600] Also, what is staying the same?
[00:01:31.600 --> 00:01:44.640] We also get into the shift of PM to edge ratios that AI is introducing, the five skills becoming most important in being a successful PM over the coming years, and the single biggest lesson that each of them have learned over the course of their careers.
[00:01:44.640 --> 00:01:48.880] If you enjoy this podcast, don't forget to subscribe and follow it in your favorite podcasting app or YouTube.
[00:01:48.880 --> 00:01:50.720] It helps tremendously.
[00:01:50.720 --> 00:02:05.320] And if you become an annual subscriber of my newsletter, you get a year free of 15 incredible products, including Lovable, Replid, Bolt, NADN, Linear, Superhuman, Descript, Whisperflow, Gamma, Perplexity, Warp, Granola, Magic Patterns, Raycast, Chat PRD, and Mobbin.
[00:01:59.920 --> 00:02:08.920] Head over to lennysnewsletter.com and click product pass.
[00:02:08.920 --> 00:02:13.000] With that, I bring you Aji and Ezine Odeswe.
[00:02:13.000 --> 00:02:15.160] This episode is brought to you by Mercury.
[00:02:15.160 --> 00:02:17.400] I've been banking with Mercury for years.
[00:02:17.400 --> 00:02:20.760] And honestly, I can't imagine banking any other way at this point.
[00:02:20.760 --> 00:02:24.280] I switched from Chase, and holy moly, what a difference.
[00:02:24.280 --> 00:02:30.920] Sending wires, tracking spend, giving people on my team access to move money around, so freaking easy.
[00:02:30.920 --> 00:02:39.320] Where most traditional banking websites and apps are clunky and hard to use, Mercury is meticulously designed to be an intuitive and simple experience.
[00:02:39.320 --> 00:02:48.600] And Mercury brings all the ways that you use money into a single product, including credit cards, invoicing, bill pay, reimbursements for your teammates, and capital.
[00:02:48.600 --> 00:03:05.240] Whether you're a funded tech startup looking for ways to pay contractors and earn yield on your idle cash, or an agency that needs to invoice customers and keep them current, or an e-commerce brand that needs to stay on top of cash flow and excess capital, Mercury can be tailored to help your business perform at its highest level.
[00:03:05.240 --> 00:03:09.240] See what over 200,000 entrepreneurs love about Mercury.
[00:03:09.240 --> 00:03:12.920] Visit Mercury.com to apply online in 10 minutes.
[00:03:12.920 --> 00:03:18.600] Mercury is a fintech, not a bank, banking services provided through Mercury's FDIC insured partner banks.
[00:03:18.600 --> 00:03:21.000] For more details, check out the show notes.
[00:03:21.000 --> 00:03:27.160] My podcast guests and I love talking about craft and taste and agency and product market fit.
[00:03:27.160 --> 00:03:28.840] You know what we don't love talking about?
[00:03:28.840 --> 00:03:29.720] SOC2.
[00:03:29.960 --> 00:03:31.400] That's where Vanta comes in.
[00:03:31.400 --> 00:03:39.320] Vanta helps companies of all sizes get compliant fast and stay that way with industry-leading AI, automation, and continuous monitoring.
[00:03:39.320 --> 00:03:49.760] Whether you're a startup tackling your first SOC 2 or ISO 27001 or an enterprise managing vendor risk, Vanta's trust management platform makes it quicker, easier, and more scalable.
[00:03:50.080 --> 00:03:56.560] Vanta also helps you complete security questionnaires up to five times faster so that you can win bigger deals sooner.
[00:03:56.560 --> 00:04:05.920] The result, according to a recent IDC study, Vanta customers slashed over $500,000 a year and are three times more productive.
[00:04:05.920 --> 00:04:08.320] Establishing trust isn't optional.
[00:04:08.320 --> 00:04:10.400] Vanta makes it automatic.
[00:04:10.400 --> 00:04:15.120] Get $1,000 off at Vanta.com/slash Lenny.
[00:04:18.000 --> 00:04:22.480] Aji and Ezine, thank you so much for being here and welcome to the podcast.
[00:04:22.480 --> 00:04:24.400] Thank you for having us, Lenny.
[00:04:24.400 --> 00:04:26.240] It's absolutely my pleasure.
[00:04:26.240 --> 00:04:29.040] You guys are a rare guest duo.
[00:04:29.040 --> 00:04:30.320] You two are married.
[00:04:30.320 --> 00:04:32.400] You're both longtime product leaders.
[00:04:32.400 --> 00:04:36.080] You've been in product combined for over 50 years.
[00:04:36.080 --> 00:04:43.120] The cool thing about the fact that you guys have been in product for so long is that you've seen a lot of change in the role of a PM and the role of building products.
[00:04:43.120 --> 00:04:45.360] You've also seen what doesn't change.
[00:04:45.360 --> 00:04:49.440] And so let me actually start here, and I'll throw this to Ezina.
[00:04:49.440 --> 00:04:56.720] What are you seeing changing most in the role and day-to-day job of a product manager?
[00:04:56.720 --> 00:04:58.800] And what are you seeing staying the same?
[00:04:58.800 --> 00:05:06.000] At a high level, the core value that a PM has to deliver hasn't changed.
[00:05:06.000 --> 00:05:17.040] It really is de-risking the product delivery process while also really trying to maximize the value the business gets from these investments.
[00:05:17.040 --> 00:05:35.080] But tactically, I think that the good thing is that we're finding out that PMs are freed up more, and as a result, they can actually invest more in true developing true insights about their customer.
[00:05:35.400 --> 00:05:47.800] So you're seeing or you're finding out your engineering leaders, engineering team is they're asking you to really confirm your insight and your instinct about the customer.
[00:05:47.800 --> 00:05:56.520] The second thing I would say is that there's a need to orchestrate a little differently.
[00:05:56.520 --> 00:06:02.120] So in the past it was around people orchestrating, people getting into the room, et cetera.
[00:06:02.120 --> 00:06:04.520] But now there are many moving pieces.
[00:06:04.920 --> 00:06:19.480] There's the software, there are also the feedback loops, the LLMs that are the multiple touch points that you have in technology, ensuring that those are also being built in a way that allows for true learning within your software.
[00:06:19.480 --> 00:06:22.760] So that idea of working static software isn't true anymore.
[00:06:22.760 --> 00:06:29.560] It's almost like living, breathing software, and ensuring that the things that allow it to breathe actually occur.
[00:06:29.560 --> 00:06:39.400] The other piece, two other pieces, data literacy, an understanding of what it is that your product is learning from.
[00:06:40.440 --> 00:06:43.800] Where is the data going and how is it being used?
[00:06:43.800 --> 00:06:52.760] How is it being organized so that it could be leveraged for future insight about your product use and customer use?
[00:06:52.760 --> 00:07:03.560] And then the last I would say is just creating the right guardrails within your product to ensure, that it actually continues to do what it needs to do.
[00:07:03.560 --> 00:07:08.440] You know, we spoke earlier, and I have a whole thing about ethics of AI, et cetera.
[00:07:08.440 --> 00:07:12.840] And I think that we have a responsibility as product people.
[00:07:12.840 --> 00:07:19.120] And I'm seeing PMs ask the questions that we didn't get to see them ask when we were building social media.
[00:07:14.840 --> 00:07:21.040] So, those are the kind of things I would say.
[00:07:21.360 --> 00:07:23.200] I would definitely want to come back to that topic.
[00:07:23.200 --> 00:07:29.760] There's something you just talked about that connects so deeply with something a recent guest talked about, Asha Sharma.
[00:07:29.760 --> 00:07:45.680] She's a chief vice president at Microsoft for Gandhi AI, and she had this concept that products are evolving from product as artifact, where it's this like done thing you ship to a product as an organism that continues to evolve based on data that it's feeding and this metabolism of data.
[00:07:46.000 --> 00:08:00.480] In our book, you'll see we talk about systems, and I have this innate passion around the idea of what a system is, and it has to do with that living, breathing, and interaction with every other thing.
[00:08:00.480 --> 00:08:02.480] So, yes, yes, yes, and yes.
[00:08:02.800 --> 00:08:05.440] Let me throw over to Aji this first point you made.
[00:08:05.760 --> 00:08:16.400] I love this idea that PMs, there's more time spent almost at kind of the top of the funnel, the top of the product development lifecycle, confirming insights from customer data, deciding what to build.
[00:08:16.560 --> 00:08:26.240] There's talk recently of just PMs almost becoming the bottleneck for teams because engineers are moving so much faster and PMs are the same, just like sitting, maybe docs are easier to write.
[00:08:27.280 --> 00:08:28.320] What do you see there?
[00:08:28.320 --> 00:08:37.520] What do you think about just the, I don't know, this ratio shift potentially for PMs almost become a bottleneck with AI accelerating other functions?
[00:08:37.840 --> 00:08:43.280] For the last 20 years, we've had like very fixed ratios in terms of how what gets done.
[00:08:43.280 --> 00:08:48.320] Like, PM takes this amount of time, developers take this amount of time, go to market takes amount of time.
[00:08:48.320 --> 00:08:58.480] And I think that contract is basically exploding because the build process, you know, sort of the whole solutionary process of the build process is accelerating so fast.
[00:08:58.480 --> 00:09:05.320] And so a company we worked with told us that, you know, they're sort of like a build contract build company.
[00:09:05.640 --> 00:09:12.680] They said they would get off the phone from a pitch, and in four hours they would have a prototype.
[00:09:12.680 --> 00:09:15.320] And so the question is, like, what is a PM doing?
[00:09:15.320 --> 00:09:20.840] The person listening to that is half PM, half developer, in order to be able to produce that.
[00:09:20.840 --> 00:09:26.200] And so what we're seeing is that the way, you know, PMs do three things.
[00:09:26.200 --> 00:09:28.840] We think about what the customers want that is profitable.
[00:09:28.840 --> 00:09:33.320] So working on sharp problems and defining sharp problems and what's in the customer's mind and makes it sharp.
[00:09:33.320 --> 00:09:36.120] We support the solutioning and the build of it.
[00:09:36.120 --> 00:09:40.760] And then we don't do this enough, but we need to support the go-to-market of it.
[00:09:40.760 --> 00:09:45.400] And so what's happening is that this middle thing is becoming more capable and much faster.
[00:09:45.400 --> 00:09:48.120] And so the way we support it, it has to change.
[00:09:48.360 --> 00:09:52.200] PRD writing is insufficient at the moment.
[00:09:52.840 --> 00:09:59.720] You know, sort of like the static way we ask questions about what's in the customers' minds is like that stuff needs to be like faster.
[00:09:59.720 --> 00:10:01.640] The cycle time is so fast now.
[00:10:01.640 --> 00:10:03.240] And so PMs need to adapt.
[00:10:03.240 --> 00:10:07.960] And that needs new tool sets, new skill sets, and so on and so forth.
[00:10:07.960 --> 00:10:10.680] I think that's where the most pressure is.
[00:10:10.680 --> 00:10:19.640] But at the same time, like Azina said, PMs can take some of the, if they adapt to this really well, they can spend a lot more time here, right?
[00:10:19.640 --> 00:10:22.120] And that's where there's a lot of opportunity.
[00:10:22.120 --> 00:10:28.840] And, you know, the thing that we want PMs to do is not be the crouch, not worry about people saying, oh, you're the bottleneck.
[00:10:28.840 --> 00:10:35.880] No, don't be the bottleneck, first of all, but also since you see everything, add value in other places where traditionally you ever spend time.
[00:10:35.880 --> 00:10:38.360] I want to pull on this sharp problem thread.
[00:10:38.360 --> 00:10:40.360] This is my favorite Audi meme.
[00:10:40.360 --> 00:10:43.560] We talked about this a bit when you visited the podcast last.
[00:10:43.560 --> 00:10:45.680] I think it's such a useful concept.
[00:10:45.680 --> 00:10:48.400] So I just want to make sure people learn this while we're talking about it.
[00:10:44.840 --> 00:10:51.040] Talk about what a sharp problem is and why that's important.
[00:10:51.360 --> 00:11:05.040] One of the things I learned from failing a lot at building companies is that sometimes, you know, the way that YC other people pitch it is that you build something and if it doesn't work, you pivot, da-da-da, you know, stuff like that.
[00:11:05.040 --> 00:11:19.120] But it turns out that in the world of high technology and the world of figuring out what customers need, there are many shortcuts that mean that pivoting isn't just like a habit, something you walk around drunkenly doing, right?
[00:11:19.120 --> 00:11:24.240] A lot of the most successful people, you know, Bill Gates, Facebook, and so they didn't even pivot.
[00:11:24.240 --> 00:11:28.000] They had an idea, but it turned out that the idea was a sharp problem.
[00:11:28.000 --> 00:11:39.360] So the way to avoid drunken startup building is pick things that are old.
[00:11:40.240 --> 00:11:42.080] They're core needs.
[00:11:42.080 --> 00:11:48.880] And the way you profit is when you reimagine old needs in new technological ways.
[00:11:48.880 --> 00:11:50.160] And that's important.
[00:11:50.160 --> 00:11:53.200] Now, let's frame it in the sharp problem.
[00:11:53.200 --> 00:11:57.200] What do people feel they need help with?
[00:11:57.200 --> 00:11:58.720] What is still difficult?
[00:11:58.720 --> 00:12:12.560] To the point that if you improve it three to five times, maybe 10x, or you take cost out of it, like 10x to cost, right, one tenth, they'll be like, this is so compelling, you should take my money now.
[00:12:12.560 --> 00:12:16.320] And that's what Lisbon should really, really focus on if they can.
[00:12:16.640 --> 00:12:28.480] There's extended stuff we've talked about over the years, like the Unicorn framework, which is like, if you're building for B2B specifically, how do you draw a quadrant around the problem so you know if it's highly frequent?
[00:12:28.480 --> 00:12:32.040] Because frequency also means a lot of pain and figure it out.
[00:12:29.760 --> 00:12:33.960] But that's something people can read on the substar.
[00:12:34.200 --> 00:12:37.880] Yeah, and we talked about that in our last chat, so I'll point people to that.
[00:12:37.880 --> 00:12:54.040] Okay, so building on this idea of being a drunken walk of figuring out what to build, you have this concept of the shipyard, and this is kind of a way to think about how to think about what product development might look like going forward and just a model for how to work in this chaotic world that we're in.
[00:12:54.040 --> 00:12:57.400] Talk about what that looks like in the advice here.
[00:12:57.400 --> 00:12:59.080] I'm going to give some credit.
[00:12:59.080 --> 00:13:07.080] When I was at Lassian, the first time I encountered the idea was with Jeff Redford, he's now like a VC, and he talked about it.
[00:13:07.080 --> 00:13:12.200] But you know, I've taken the idea because I loved it to maybe a whole different level, maybe it says a clarity.
[00:13:12.200 --> 00:13:15.880] A shipyard is to evoke controlled chaos.
[00:13:15.880 --> 00:13:30.120] If you go to a shipyard at peak, you know, Long Beach or whatever it is, there's so much going on and like big heavy machinery moving around, trucks and containers from China, from Malaysia jumping around.
[00:13:30.120 --> 00:13:38.280] But this chaos underlying it is careful communication, high skill, right?
[00:13:38.520 --> 00:13:46.280] So, what looks like you know, Brownian motion is actually orchestrated progress.
[00:13:46.280 --> 00:13:54.760] And, you know, these are these shipyards at the center of GDP for a lot of nations because this is where trade happens.
[00:13:54.760 --> 00:13:58.040] So, this is what it's supposed to evoke: controlled chaos.
[00:13:58.040 --> 00:14:01.720] And so, what a shipyard team looks like is a six-person team.
[00:14:01.720 --> 00:14:14.120] It is PM, it is engineering, it is design, it is user research, it is data, ML, AI expertise, AI PMs, or whatever you call it.
[00:14:14.120 --> 00:14:19.280] It is product marketing in a pod, problem solving.
[00:14:19.280 --> 00:14:25.520] You know, in this, especially in the AI age, the idea of stand-ups isn't really even make as much sense, right?
[00:14:25.520 --> 00:14:33.600] These people should be, you know, communicated and collaborating hourly basically to solve the problem because especially they're working in new problem states.
[00:14:33.600 --> 00:14:35.840] So I think that's what the core shipyard team means.
[00:14:35.840 --> 00:14:44.000] And then there are these tendrils to, you know, we call salespeople, customer success people, the nervous, sort of the skin.
[00:14:44.000 --> 00:14:47.040] Like, think about the shipyard team as the brain, but this is the skin.
[00:14:47.040 --> 00:14:49.120] This is how you feel customers.
[00:14:49.120 --> 00:14:52.240] And so here's a concrete example.
[00:14:52.560 --> 00:14:59.360] Before I shipped anything at Caldly, I did the design review with a support manager.
[00:14:59.600 --> 00:15:09.520] The support manager had to be in the room because the design people and engineering people, as much as we cared, we shipped problems all the time, you know, featured that.
[00:15:09.520 --> 00:15:12.400] Where the support manager looks at it and is like, that's not going to work.
[00:15:12.400 --> 00:15:16.640] I've had a thousand conversations and it was so useful.
[00:15:16.640 --> 00:15:21.440] And so having the shipyard team have the tendrils of connection to customer teams was so important.
[00:15:21.440 --> 00:15:24.000] So what I'm hearing is essentially embrace chaos.
[00:15:24.000 --> 00:15:25.600] Things are going to get only weirder.
[00:15:25.600 --> 00:15:31.520] The something, I forget who, maybe the CPO at OpenAI said just like it's only going to get weirder.
[00:15:31.520 --> 00:15:34.560] This is the least, this is the most normal the world will be.
[00:15:35.440 --> 00:15:35.680] Yep.
[00:15:36.000 --> 00:15:49.840] And that's important because if the world is weird and you want to make sense of it, you need people who are highly motivated, have the right skill sets to continuously create it as it moves, as it gets weird.
[00:15:49.840 --> 00:15:57.520] Audi, you said EP, I remember Lenny asked how many, you know, what is what is shipyard comprised of?
[00:15:57.520 --> 00:15:59.360] He said six people person team.
[00:15:59.360 --> 00:16:02.760] I know that's going to come up, and people are going to go, sorry, maybe why?
[00:16:03.400 --> 00:16:04.520] It's six person.
[00:16:04.680 --> 00:16:06.200] Six capability team.
[00:15:59.920 --> 00:16:06.280] Okay.
[00:16:06.760 --> 00:16:08.200] It is not a person thing.
[00:16:08.520 --> 00:16:11.480] It is engineering, however many you need.
[00:16:11.480 --> 00:16:13.080] Again, it's all about admission.
[00:16:13.080 --> 00:16:14.840] It's PM, it's design.
[00:16:15.160 --> 00:16:17.640] It's the capabilities I'm optimizing for, right?
[00:16:18.040 --> 00:16:22.840] Not that data and ML, it's product marketing and it's a user research.
[00:16:23.000 --> 00:16:24.200] PMs can't do it.
[00:16:24.200 --> 00:16:25.640] You need the capabilities.
[00:16:25.640 --> 00:16:32.440] And whether it's a fractional person within Matrix into that EPD team or 10 people because of developers, that's what you need.
[00:16:32.440 --> 00:16:32.680] Yeah.
[00:16:32.680 --> 00:16:32.920] Yeah.
[00:16:32.920 --> 00:16:36.680] So what I'm hearing there is roles will become blurrier.
[00:16:36.680 --> 00:16:42.200] That PMs need to do more design, engineers need to do more PMing, people need more data skills.
[00:16:42.600 --> 00:16:48.040] So to me, those are the two takeaways: is just learn to be more comfortable with chaotic world.
[00:16:48.200 --> 00:16:51.480] Just AI constants, every week there's like, oh, this is changing everything.
[00:16:51.480 --> 00:16:53.160] Rethink the roadmap.
[00:16:53.160 --> 00:16:56.760] And then also the roles will be less divided.
[00:16:56.760 --> 00:17:00.520] People will need to learn to do many roles and chip in.
[00:17:00.520 --> 00:17:00.920] Yes.
[00:17:00.920 --> 00:17:01.720] Awesome.
[00:17:01.720 --> 00:17:19.320] So building on that, Ezine, when you hire PMs these days, when you're looking for product leaders, what do you, is there something you're looking for more and more because you believe this will be more important to be successful in a world where AI is just changing everything?
[00:17:19.640 --> 00:17:24.040] I think OG started, he laid the foundation when he said it's day one.
[00:17:24.360 --> 00:17:32.840] And when you're working and thinking about building product in this type of era of space, it's important that you remember that.
[00:17:32.840 --> 00:17:41.960] So, I'm going to talk about what it is that I typically will look for from an attitude's values, and behaviors point of view, as well as a skills point of view.
[00:17:41.960 --> 00:17:49.120] I'm currently working with a firm in LA and I'm building curriculum for their PMs as well as helping them recruit.
[00:17:49.440 --> 00:17:56.160] And the things that we've honed in on are curiosity.
[00:17:56.160 --> 00:18:01.840] And I know it's been said over and over again, but it matters more now.
[00:18:01.840 --> 00:18:12.480] It is, is this person humble and curious enough to admit they don't know and are willing to start as a learner?
[00:18:12.480 --> 00:18:21.920] Do they truly have a growth mindset where they are okay being taught by someone else, no matter how senior they have become?
[00:18:21.920 --> 00:18:23.120] That is very important.
[00:18:23.120 --> 00:18:26.640] That humility and curiosity mixed together.
[00:18:26.640 --> 00:18:30.800] The second part is what I'd call agency.
[00:18:30.800 --> 00:18:41.440] Being able to see opportunity and not ask others for permission to go execute on that opportunity, but actually take the opportunity themselves.
[00:18:41.440 --> 00:18:46.000] Have a strong, there's another word most people use for this.
[00:18:46.320 --> 00:18:49.200] It's not coming to me, but just take a bull.
[00:18:49.360 --> 00:18:49.920] Pull by the horn.
[00:18:49.920 --> 00:18:52.720] But no, there's another characteristic value, behavior.
[00:18:53.200 --> 00:19:00.640] But high agency, being able to just move forward and feel, act like an owner, right?
[00:19:00.960 --> 00:19:02.960] Ownership, that's the word.
[00:19:03.280 --> 00:19:04.800] High ownership.
[00:19:05.280 --> 00:19:16.560] But I think ownership is seeing that you can run, but I think agency assumes that you control, you're more of a thermostat than a thermometer.
[00:19:16.560 --> 00:19:23.680] And that means that, hey, the thermometer is always measuring the temperature of the room, while a thermostat will change the temperature of the room.
[00:19:23.680 --> 00:19:26.080] So I'm looking for that in people.
[00:19:26.080 --> 00:19:29.840] And I want people who can move from a position of strength.
[00:19:30.200 --> 00:19:36.360] It really irked me when in 2024, most people were saying AI is here, and now the PM job is gone.
[00:19:36.360 --> 00:19:45.000] And there were others who were saying AI is here, so now we can do the things we don't like to do with AI and then go focus on the things that we need to do.
[00:19:45.000 --> 00:19:49.080] So people who spoke that way are the types of people I want on my team.
[00:19:49.080 --> 00:19:54.920] From a skills perspective, obviously, an understanding of data, right?
[00:19:54.920 --> 00:20:01.080] Just a high level of how is it organized, how is it leveraged in this new world of AI.
[00:20:01.400 --> 00:20:04.920] There's this skill of being able to write evals.
[00:20:05.080 --> 00:20:14.360] I know everybody can write prompts, prompt engineering, you can try and focus the LLM so that it can offer better insights and offer better results.
[00:20:14.360 --> 00:20:26.520] But even as your LM actually produces results, you need to verify that it actually is smart and it hasn't hallucinated or gone off by creating true evals, right?
[00:20:26.840 --> 00:20:28.440] Those are some hardcore skills.
[00:20:28.440 --> 00:20:41.560] Being able to constrain hallucination, knowing what ways you do that, being able to use multiple models and figure out what is good about this model versus that.
[00:20:41.880 --> 00:20:44.920] One of the things I love to talk about is what does intelligence mean?
[00:20:44.920 --> 00:20:48.120] Intelligence, often people say, is connecting dots.
[00:20:48.120 --> 00:21:00.360] And the beauty of what it is that we have an opportunity to do is that with AI, you can find, you can train it so that it's very specific to a very particular task, right?
[00:21:00.360 --> 00:21:02.760] It can become really smart in one area.
[00:21:02.760 --> 00:21:15.520] And the idea now is: how do you combine the smarts of what maybe Claude does, right, with something that GPT does better and make something even greater than each of them?
[00:21:16.000 --> 00:21:19.600] So that's one of the things I'm looking for in a PM.
[00:21:19.600 --> 00:21:27.920] And then being able to tweak for best for performance, whether it's choosing, whether it's just fine-tuning, doing the weight adjustments.
[00:21:27.920 --> 00:21:29.760] Oji is very passionate about this one.
[00:21:29.760 --> 00:21:31.360] We talk about it quite a bit.
[00:21:31.840 --> 00:21:38.960] But that's the attitudes point of view where, you know, just strong ownership, strong agency, curiosity, being humble.
[00:21:38.960 --> 00:21:47.280] And then this idea of understanding beyond prompts, like, you know, really knowing that AI is just a toolbox, right?
[00:21:47.280 --> 00:21:49.040] And it can make major mistakes.
[00:21:49.040 --> 00:22:03.520] And how do you constrain it such that constrain it, optimize it, tweak it so that it can actually work on your behalf and not like it, you need to make it work for you versus being lost in all the things that it's able to do.
[00:22:03.520 --> 00:22:12.400] I love this answer, and it resonates so well with so many other people's advice, which is curiosity, skills that matter most.
[00:22:12.400 --> 00:22:17.920] And this actually curiosity comes up a lot when I ask people what skills are you focusing on teaching your kids.
[00:22:18.240 --> 00:22:22.320] The number one skill people mention is curiosity.
[00:22:22.320 --> 00:22:23.840] I love that you included humbleness.
[00:22:23.840 --> 00:22:24.880] I want to come back to that.
[00:22:24.880 --> 00:22:34.400] So essentially, curiosity, humbleness, agency, such a meme these days and what PMs need to build and be strong at high agency.
[00:22:34.400 --> 00:22:35.680] So great.
[00:22:35.680 --> 00:22:37.520] And then evals, I like that a lot.
[00:22:37.520 --> 00:22:38.640] That's also super common.
[00:22:38.640 --> 00:22:46.240] I have a really cool guest post coming out really soon that'll teach you more than you ever thought you needed to know about how to become really good at evals.
[00:22:46.240 --> 00:23:02.280] But I want to double down on something you just said, which I love so much, which is this idea of instead of waiting to see what jobs AI takes, it's almost like the way you phrased it essentially: is what jobs can you take as a PM?
[00:23:02.280 --> 00:23:03.240] What can you do?
[00:23:03.240 --> 00:23:04.360] Can you do more engineering?
[00:23:04.360 --> 00:23:05.800] Can you get more design?
[00:23:05.800 --> 00:23:07.640] Such an empowering way to think about it.
[00:23:08.280 --> 00:23:10.600] Let me talk about something that's really personal.
[00:23:11.240 --> 00:23:21.560] I'm part of the women in product organization, and I was asked to guess right end of last year because there was a lot of fear in the community.
[00:23:21.560 --> 00:23:25.640] And there was this thing called the rise of the CTPO.
[00:23:25.640 --> 00:23:37.880] It's this thing, it's been in the background there, where I think many venture VCs, many PEs were asking, do I need a CPO as well as a CTO?
[00:23:37.880 --> 00:23:38.440] Right?
[00:23:38.760 --> 00:23:49.880] And I was always amazed at how many people were leaned back in this thing versus actually showing and proving the value of what a CPO could be.
[00:23:49.880 --> 00:23:58.120] So in many places, it is right that maybe you do only want one person, particularly when it's very well-known spaces, right?
[00:23:58.120 --> 00:24:01.240] It's clear, they're not doing anything revolutionary.
[00:24:01.240 --> 00:24:05.240] But it's like saying an engineer is the same as a product person when you think about it that way.
[00:24:05.240 --> 00:24:14.920] There is a conflict between the CPO and the CTO that actually drives innovation, in my opinion, because one person is trying to optimize the resourcing, the fixed, right?
[00:24:14.920 --> 00:24:21.800] While another person is actually trying to field opportunities and figure out how to wait and decide what is best to do it.
[00:24:21.800 --> 00:24:24.120] At the higher level, those are their jobs.
[00:24:24.120 --> 00:24:33.160] And that tension is really good because you need them to sharpen each other as you try to maximize the value of your organization, of the organization.
[00:24:33.160 --> 00:24:51.840] I say all this to say that there are folks who are not stopping to figure out how do we make the pie bigger, but they're looking at instead how do we divide this pie as it gets smaller, versus asking the question of Rimi, what is the opportunity to increase this for all of us?
[00:24:51.840 --> 00:24:54.880] And that is a skill set I also look out for.
[00:24:54.880 --> 00:24:55.280] Awesome.
[00:24:55.600 --> 00:24:58.800] I'm going to go to Yaji in a second, but I want to come back on this humble idea.
[00:24:58.800 --> 00:24:59.840] I haven't heard this before.
[00:25:00.560 --> 00:25:03.600] Why do you find it really important for people to be humble?
[00:25:03.600 --> 00:25:05.920] I haven't heard that piece of advice yet.
[00:25:05.920 --> 00:25:07.520] Oh my gosh.
[00:25:07.840 --> 00:25:11.120] Particularly now, it is really important to be.
[00:25:11.520 --> 00:25:14.480] Because I would say, okay, I went to school.
[00:25:14.480 --> 00:25:18.640] I actually did neural networks in school when I was doing a project.
[00:25:18.640 --> 00:25:24.800] And, you know, you get to this part of your career where you go, I kind of know a lot of what product is.
[00:25:24.800 --> 00:25:38.720] And I do see folks who are not willing to go sign up for that new junior AI PM course because they're CPO or they're head of product or they're X, Y, or Z, or they're a principal PM now.
[00:25:39.040 --> 00:25:40.560] And I think that that's a mistake.
[00:25:40.560 --> 00:25:43.760] I think that it's important to realize that you don't know everything.
[00:25:43.760 --> 00:25:45.440] You can't know everything.
[00:25:45.440 --> 00:25:59.920] And even as a leader, CPO, head of product, et cetera, I come from this school where I believe that in order to lead an organization, sometimes you actually do need to know how to do the work.
[00:25:59.920 --> 00:26:02.000] And I think I lead better that way.
[00:26:02.000 --> 00:26:07.920] So being able to sign up and say, you know what, let me sit in and figure out how we're going to build this thing.
[00:26:07.920 --> 00:26:17.360] And number one, learn for yourself, but also figure out what questions other PMs are asking in that course, right?
[00:26:17.360 --> 00:26:31.000] So it's about taking courses from people you may have considered junior, figuring out and sitting and reading books that you thought you, you know, just going back to the day one of how do you build product in an AI world.
[00:26:31.000 --> 00:26:36.360] So that's where humility comes to play in this period where we are.
[00:26:36.360 --> 00:26:42.680] It's you just don't know everything, can't know everything, and things will change faster than you're able to stay ahead of.
[00:26:42.680 --> 00:26:45.800] So lean on your team as well.
[00:26:45.800 --> 00:26:48.360] I just had this guy, Chip Connolly, on the podcast.
[00:26:48.360 --> 00:26:49.720] He's in his 60s.
[00:26:49.720 --> 00:26:53.560] He joined Airbnb when he was 52 as an intern.
[00:26:53.560 --> 00:26:57.480] And he said exactly the same thing to be successful.
[00:26:57.480 --> 00:27:03.880] And it's interesting as a metaphor, as an older person, you need to be really good at being open to learning from people younger than you.
[00:27:03.880 --> 00:27:16.280] And that's a really interesting, I think, lens and heuristic almost for just people that aren't as AI savvy yet, just leaning into that idea of just like, okay, I can actually learn a lot from people around me, even if they're much younger than me.
[00:27:16.280 --> 00:27:32.360] Okay, so Aji, something else that I think that you're really good at and you've been spending time on that I think is also really important is getting super hands-on with the tech, not just like reading and not just listening to podcasts like this, not just reading newsletters, but like actually building stuff.
[00:27:32.360 --> 00:27:40.040] Talk about just like why that's important, what you've been doing to learn and to stay ahead and not just be like a pontificate or guy in the clouds.
[00:27:40.040 --> 00:27:42.760] I just love everything we've said in the last few minutes.
[00:27:43.640 --> 00:27:48.600] I think the shorthand I have for humility is humility is teachability.
[00:27:49.480 --> 00:27:52.200] And when do you need teachability?
[00:27:52.200 --> 00:28:01.240] You need teachability when everything has changed, where there is no blueprint.
[00:28:01.240 --> 00:28:09.800] And whatever you think is a blueprint of AI in the last few years, you know, whether, oh, wow, Lavable is so successful, the Red Player is so successful.
[00:28:09.800 --> 00:28:14.120] It's actually quite possible that any one of those companies won't be around in two, three years, right?
[00:28:14.120 --> 00:28:18.400] Because we're still writing the playbook for what success looks like.
[00:28:18.720 --> 00:28:24.320] So humility is teachability, and teachability is survivability if you're thinking about careers.
[00:28:24.320 --> 00:28:29.280] And if you also, if you're thinking about building something that's important, it's also important like that.
[00:28:29.680 --> 00:28:38.160] In terms of how to keep your hand sharp, what I would say is a couple of things.
[00:28:38.160 --> 00:28:49.600] One is, I have written, Isaiah and I both have a master's engineering, but we very quickly figured out that the thing that we were gifted at was helping to invent the future.
[00:28:49.600 --> 00:28:53.280] Where customers are thinking, you know, how do we work with developers to do that?
[00:28:53.280 --> 00:28:55.760] And so we've been PMs forever.
[00:28:55.760 --> 00:28:59.440] And being a PM means that even though we could, we don't write code.
[00:28:59.440 --> 00:29:01.920] We work with developers, we work with other people in the market.
[00:29:01.920 --> 00:29:12.720] But I've written more code in the last one year than I have in the last 10 years because code is now essentially architecture and English or whatever language that you have.
[00:29:12.720 --> 00:29:13.840] And so what does that mean?
[00:29:13.840 --> 00:29:16.880] I've taken the opportunity to write more code.
[00:29:16.880 --> 00:29:20.960] I've taken the opportunity to convert PRD writing into prototype writing.
[00:29:20.960 --> 00:29:28.400] I've taken the opportunity to write more API interfaces and actually call those API interfaces with Postman myself.
[00:29:28.400 --> 00:29:39.280] I've taken the opportunity to subscribe to all the AI tools, right, and take that as a cost of learning so that I can understand what each model is useful for.
[00:29:39.280 --> 00:29:44.560] I've taken the opportunity to understand MCPs and connect different things to it.
[00:29:44.560 --> 00:29:54.080] Like right now, one of my pet projects, or one of the things I do in order to learn, is I pick a project that it's almost like writing an eval, what's an eval for my life to learn.
[00:29:54.240 --> 00:30:01.720] I'll take a pick a project that touches a lot of the things that I need to learn that I'm interested and passionate about.
[00:29:59.120 --> 00:30:05.560] And as I get into it, because of my passion, I start to learn more and more.
[00:30:05.880 --> 00:30:10.760] And so, one of the passionate purposes I've had I have is to automate my house.
[00:30:10.760 --> 00:30:23.240] So, I've been doing this for years and years, but now I'm trying to build a very smart house run on a processor that has AI chip and inference in it while the house looks dumb.
[00:30:23.240 --> 00:30:25.400] So, because I don't want anyone to know about it.
[00:30:25.400 --> 00:30:26.520] And so, what am I doing?
[00:30:26.520 --> 00:30:29.160] I'm getting models.
[00:30:29.160 --> 00:30:33.960] I am learning about how to, what quantization of those models means.
[00:30:33.960 --> 00:30:36.600] I am learning how to fine-tune some of them.
[00:30:36.600 --> 00:30:37.320] I'm feeding them.
[00:30:37.320 --> 00:30:41.720] I'm building an MCP server in my house that talks to the home entities.
[00:30:41.720 --> 00:30:47.720] And then I can collaborate with Claude on how to build and make it sort of super dynamic.
[00:30:48.040 --> 00:31:00.920] But beyond that, you know, one of the things we're spending time on is Advent Studio, I'm raising capital for Adventure Studio because that's what I want to spend the next decade on building seminar AI companies beyond the book.
[00:31:00.920 --> 00:31:11.240] I'm collaborating with some of the smartest people in AI right now who can teach me how to do this, how to write agents, what agents really mean.
[00:31:11.560 --> 00:31:21.640] And so, everything, as I said, writing evals, being humble, going back after 25 years of being a PM to being teachable is how I stay sharp.
[00:31:21.640 --> 00:31:23.400] I want to go back to this house.
[00:31:24.200 --> 00:31:25.720] What is your house doing?
[00:31:26.120 --> 00:31:27.880] How smart is this house?
[00:31:28.440 --> 00:31:41.880] The idea is that I give the house eyes and ears, and it senses the coolest thing is I've specced out what I call like a super sensor.
[00:31:41.880 --> 00:31:51.920] It will see people, it will feel their heat, it will hear them, it will sense humidity and temperature, and it will just be scattered around the house, right?
[00:31:52.080 --> 00:31:54.960] And I'm building the hardware myself.
[00:31:54.960 --> 00:31:59.440] And basically, as you walk into the house, it starts to adapt itself to you, right?
[00:31:59.440 --> 00:32:01.840] It manages energy in the house.
[00:32:01.840 --> 00:32:03.200] It knows you're in a room.
[00:32:03.200 --> 00:32:06.560] If you leave, my kids are not in their rooms right now.
[00:32:06.560 --> 00:32:11.520] Well, the HVAC around them just stopped and said, oh, they're not here.
[00:32:11.760 --> 00:32:16.720] When they come back, before they come back, an hour before they come back, it starts to pre-cool their space.
[00:32:16.720 --> 00:32:19.040] And then when they come back, it says, oh, they're here.
[00:32:19.040 --> 00:32:20.560] And it starts to do cool things.
[00:32:20.560 --> 00:32:23.520] So there's a lot of stuff like that, that a building.
[00:32:23.520 --> 00:32:26.160] And the question is, you don't want that to be completely inflexible.
[00:32:26.160 --> 00:32:28.320] You want it to be super dynamic and super human.
[00:32:28.320 --> 00:32:30.160] And so that's part of the AI.
[00:32:30.400 --> 00:32:32.560] I am the recipient of all of this.
[00:32:35.440 --> 00:32:36.720] I woke up at 1 a.m.
[00:32:36.800 --> 00:32:37.840] and I want toast.
[00:32:37.840 --> 00:32:39.360] Why is the toast not working?
[00:32:39.680 --> 00:32:42.240] That's because you shut off the power.
[00:32:42.480 --> 00:32:43.040] That's not good for you.
[00:32:45.440 --> 00:32:50.800] No, but look at, look, Isime walks into lots of rooms in the house and the light just turns on.
[00:32:50.800 --> 00:32:54.880] And when she leaves, she just walks out and then it knows she's not there and it turns it off.
[00:32:54.880 --> 00:32:55.920] And she's taking it for granted.
[00:32:56.240 --> 00:32:57.680] Well, it's actually doing all this.
[00:32:57.680 --> 00:32:58.320] Wow.
[00:32:58.560 --> 00:32:58.800] It is.
[00:33:00.400 --> 00:33:03.360] But there's also like a horror movie version of this.
[00:33:03.360 --> 00:33:04.800] Yes, there is.
[00:33:07.040 --> 00:33:13.200] Well, what I will say is that my family should not piss me off because I could do crazy things to them.
[00:33:14.800 --> 00:33:15.840] Oh, man.
[00:33:16.160 --> 00:33:16.800] Okay.
[00:33:17.760 --> 00:33:21.840] And this is built on like an LLM that you trained.
[00:33:22.320 --> 00:33:23.600] Is that what the powers is?
[00:33:23.920 --> 00:33:28.560] So multiple levels, there's this open source software called Home Assistant that I use.
[00:33:28.560 --> 00:33:31.400] And you can, it's very extensible.
[00:33:29.120 --> 00:33:34.840] So I can put in like fine-tuned LLMs.
[00:33:34.920 --> 00:33:38.120] I can use the big LLMs.
[00:33:39.080 --> 00:33:44.760] They're things like smaller spoken whisper models you can use to, like I can just say, hey, Jarvis.
[00:33:44.760 --> 00:33:50.120] like Iron Man, and the house will start talking to me and telling me things about what's going on.
[00:33:50.120 --> 00:33:51.480] So things like that.
[00:33:52.200 --> 00:33:53.800] Getting us back on track.
[00:33:54.680 --> 00:33:56.520] You guys are geeking out right now.
[00:33:58.120 --> 00:34:01.160] I'm excited and scared to visit this house.
[00:34:01.800 --> 00:34:03.400] That's good, my dad.
[00:34:04.680 --> 00:34:05.720] Get this out of me.
[00:34:05.960 --> 00:34:23.000] What I love about this is this is something that I've seen a bunch is having a very specific problem to solve for yourself is a really good way to build and get into these tools because you're motivated to solve a problem and it's like a specific thing that you're doing versus just generally, I'm going to go sit around and play with Lovable.
[00:34:23.000 --> 00:34:25.720] No, I think I want to double click on that, Lenny.
[00:34:25.720 --> 00:34:35.880] I was telling you the other time that we spoke that I get a lot of questions from, I would say, people who aren't very familiar with AI and they don't know where to start.
[00:34:35.880 --> 00:34:37.560] And that is exactly what I say.
[00:34:37.560 --> 00:34:40.680] It's like, hey, I know you love fashion, for example, right?
[00:34:40.680 --> 00:34:44.040] Or you like reading, right?
[00:34:44.360 --> 00:34:50.680] And you have the, you actually have the database of the entire Austin Library or whatever it is online.
[00:34:50.680 --> 00:34:54.760] And you can pull it if you really want to, and it can recommend based on your feelings.
[00:34:54.760 --> 00:34:57.960] So you can create a project around what it is that you're passionate about.
[00:34:57.960 --> 00:35:02.040] One particular woman, older lady, wanted to figure this out.
[00:35:03.080 --> 00:35:08.280] Figure out a passion project, and I knew she liked this idea called Outfit of the Day.
[00:35:08.280 --> 00:35:26.320] So with LLMs, she started a project where she actually had, she took the time to take all her tops, all her bottoms, all her dresses, all her shoes, and now she's working on, based on the temperature, having recommendations of her outfit for the day.
[00:35:26.320 --> 00:35:38.320] So that's one way she was able to find something that she loved and could take a, you know, leverage AI to build that could help her determine, okay, what should I wear or what are the options I should consider for today?
[00:35:38.320 --> 00:35:39.920] That's an amazing example.
[00:35:39.920 --> 00:35:45.040] Can I just make like one, you know, I'm always like a high concept prediction guy.
[00:35:45.040 --> 00:35:47.600] So I'm going to make one prediction.
[00:35:47.600 --> 00:35:58.640] I think that, you know, in the next few years, a lot of high agency people, not just PMs, will essentially be writing the software that automates their lives.
[00:35:59.120 --> 00:36:09.760] And so the one-man SaaS is going to go away because the ability to, it'll be hard to begin on developers because we will, people who really care will automate their lives.
[00:36:09.760 --> 00:36:12.400] They'll build applications that just work for them.
[00:36:12.400 --> 00:36:17.600] And the advice for PMs is that we should be the vanguard with that.
[00:36:17.600 --> 00:36:24.400] Because if we can start doing that for ourselves, we'll be able to do that for software that hits a million people because that's the way to get started.
[00:36:24.400 --> 00:36:26.160] And we also want to offer hope.
[00:36:26.160 --> 00:36:38.160] It's still day one, meaning that no matter how far, if you hear me and hear about fine-cheating and home assistance and all this crazy hardware that I'm trying to build, and you're scared, you're like, I'm behind.
[00:36:39.120 --> 00:36:40.240] You're not behind.
[00:36:40.240 --> 00:36:41.040] You're not behind.
[00:36:41.040 --> 00:36:48.320] People who seem like they're succeeding, we were here when the internet dropped and everyone, you know, Microsoft thought it was behind.
[00:36:48.320 --> 00:36:51.600] And, you know, with social, people, some people thought they were behind.
[00:36:51.600 --> 00:36:57.280] The people who seem like they're off the off-the-gate fastest, whether it's a company or person, not the people who win.
[00:36:57.280 --> 00:37:01.560] And so it's the wisest and the most dedicated to win.
[00:36:59.840 --> 00:37:03.480] And so people should just start.
[00:37:03.720 --> 00:37:08.280] You know, evil people, we have the proverb that says, whatever you wake up is your morning.
[00:37:08.280 --> 00:37:10.920] And so just wake up and go.
[00:37:11.160 --> 00:37:14.040] I love so much of the advice you two are sharing.
[00:37:14.920 --> 00:37:21.080] There's a post I'm going to link to in the show notes that shares a bunch of examples of what people are by coding building with AI.
[00:37:21.080 --> 00:37:22.920] That seems to be the biggest challenge for a lot of people.
[00:37:22.920 --> 00:37:24.680] It's just like, what should I actually do?
[00:37:24.680 --> 00:37:32.200] And something that I realized as I was putting that together is people think about like, oh, can you build real products and businesses on bold, lovable replic?
[00:37:32.440 --> 00:37:36.600] But interestingly, most people just want to build a thing that solves a specific problem for themselves.
[00:37:36.600 --> 00:37:41.960] And sometimes that turns into a thing that there's almost a bigger opportunity just to build all this personal software.
[00:37:42.440 --> 00:37:47.960] Similar to the example you shared, Ezine, there's a site, someone built this, like, how many layers should I wear today?
[00:37:48.200 --> 00:37:49.160] com or whatever.
[00:37:49.160 --> 00:37:52.040] I don't think that's the actual domain, and it's just like three.
[00:37:52.360 --> 00:37:54.520] And that's sometimes all you need to know.
[00:37:54.520 --> 00:37:57.080] Today's episode is brought to you by Coda.
[00:37:57.080 --> 00:38:02.440] I personally use Coda every single day to manage my podcast and also to manage my community.
[00:38:02.440 --> 00:38:06.520] It's where I put the questions that I plan to ask every guest that's coming on the podcast.
[00:38:06.520 --> 00:38:08.440] It's where I put my community resources.
[00:38:08.440 --> 00:38:10.120] It's how I manage my workflows.
[00:38:10.120 --> 00:38:11.960] Here's how Coda can help you.
[00:38:11.960 --> 00:38:15.240] Imagine starting a project at work and your vision is clear.
[00:38:15.240 --> 00:38:19.640] You know exactly who's doing what and where to find the data that you need to do your part.
[00:38:19.640 --> 00:38:29.480] In fact, you don't have to waste time searching for anything because everything your team needs from project trackers and OKRs to documents and spreadsheets lives in one tab all in Coda.
[00:38:29.480 --> 00:38:41.720] With Coda's collaborative all-in-one workspace, you get the flexibility of docs, the structure of spreadsheets, the power of applications, and the intelligence of AI, all in one easy-to-organize tab.
[00:38:41.720 --> 00:38:49.200] Like I mentioned earlier, I use Coda every single day, and more than 50,000 teams trust Coda to keep them more aligned and focused.
[00:38:49.200 --> 00:38:55.920] If you're a startup team looking to increase alignment and agility, Coda can help you move from planning to execution in record time.
[00:38:55.920 --> 00:39:02.480] To try it for yourself, go to coda.io/slash Lenny today and get six months free of the team plan for startups.
[00:39:02.480 --> 00:39:08.640] That's coda.io/slash lenny to get started for free and get six months of the team plan.
[00:39:08.640 --> 00:39:10.960] Coda.io/slash Lenny.
[00:39:11.280 --> 00:39:13.440] So, let me take this opportunity to zoom out a little bit.
[00:39:13.440 --> 00:39:18.720] We've been talking a lot about personal skills, what you think people need to get better at, what it takes to be successful.
[00:39:18.720 --> 00:39:20.560] I want to talk about at the company level.
[00:39:20.560 --> 00:39:24.560] You two work with a bunch of different companies, you've both worked at a bunch of different companies.
[00:39:24.560 --> 00:39:47.040] When you look at the companies that seem to be doing well in this new AI world, adopting the best practices, seeing productivity gains, versus the companies that are just struggling and not getting anywhere and just having a hard time, what do you find is most common across the one, the companies that are succeeding, shifting their culture and the way they work in this world of AI?
[00:39:47.040 --> 00:39:49.280] And I'll throw this to Esina first.
[00:39:49.600 --> 00:40:08.800] I think the main thing I will say is recognizing companies that recognize that AI is a core set of multiple capabilities that they've been gifted.
[00:40:08.800 --> 00:40:18.560] I think that you need to understand that it is not this magic thing that you're going to slather on to your product and it will do perfectly.
[00:40:18.560 --> 00:40:20.800] The problems are still the problems.
[00:40:20.800 --> 00:40:24.240] The customer is still the customer.
[00:40:24.240 --> 00:40:29.200] And their pain or sharp problems still exist as they have.
[00:40:29.560 --> 00:40:42.040] The idea now is: how do you take AI and not have it be something we call it, you put at the edge alone, but you actually re-transform the way you solve the customer's problem.
[00:40:42.040 --> 00:40:45.880] So we have this phrase called AI at the core and AI at the edge.
[00:40:45.880 --> 00:40:53.240] And the best way to think about it is that with AI at the edge, you're probably still using software as you always have.
[00:40:53.240 --> 00:40:56.280] The bits and bytes that you wrote before still are there.
[00:40:56.280 --> 00:41:04.360] You're instead inserting LLMs or AI at different intersections or connection points.
[00:41:04.360 --> 00:41:05.000] Okay?
[00:41:05.640 --> 00:41:28.840] AI at the core means fundamentally looking at the problem space and the workflows and using AI to solve the problem, not just sprinkling it at the intersections of GUIs or user interfaces that are out there or using it to just accelerate things.
[00:41:28.840 --> 00:41:44.920] So the way we talk about it is that companies where their code base remains and they are attaching AI to it often aren't the ones who are going to revolutionize their industry.
[00:41:45.240 --> 00:41:57.480] Companies for whom their code base perhaps shrinks and the LLM becomes a core part of what it is that they use to solve the problem are companies that actually are doing it right.
[00:41:57.800 --> 00:42:03.880] Another characteristic is specificity.
[00:42:04.520 --> 00:42:15.000] We've seen a lot of companies want to do so many things and expect LLMs to be as broad as the problem sets they want to solve for.
[00:42:15.920 --> 00:42:33.840] And we, in our experience, have seen that to do that, you actually will need to have specialized first and then create a layer of a connective tissue that can offer intelligence.
[00:42:33.840 --> 00:42:41.360] So we've seen companies instead just stay on top, try to build this massive solution set or LLM that can do it all.
[00:42:41.360 --> 00:42:54.960] But the successful ones that we've consulted with, partnered with, are talking to are those who instead have seen, you know what, I can find, I can build, with this team that I have on here, can actually build this very specific solution set.
[00:42:54.960 --> 00:42:56.240] This other team can do this.
[00:42:56.240 --> 00:43:07.440] And then my job is to tie it all together with a broader multi-model solution and build it and create an offering for the customer.
[00:43:07.440 --> 00:43:08.640] I think those are two core things.
[00:43:08.640 --> 00:43:11.040] I'm sure Oji can remember a few others.
[00:43:11.760 --> 00:43:17.600] No, I think, no, I love, I was humble in listening to you.
[00:43:19.200 --> 00:43:25.360] No, so maybe I'll embellish the point.
[00:43:25.680 --> 00:43:44.400] We're seeing companies that are starting from a blank slate, taking an old problem, a sharp problem, and saying, if we built it today with LLM as the core capability, as a core sort of blob of code, how would it look?
[00:43:44.400 --> 00:43:51.920] And often these people are finding a lot of acceleration and they're actually like taking on adjacencies.
[00:43:51.920 --> 00:44:10.920] So in a formless type form, you know, a new product built by David Okunev and me and a small group of people, we built a new type form and we ended up building more than it because it also became like a sales lead agent.
[00:44:10.920 --> 00:44:13.400] So like it did pre-sales.
[00:44:13.400 --> 00:44:19.560] which ordinarily Typeform gets you into qualification before you talk to a salesperson, but the product itself did pre-sale.
[00:44:19.560 --> 00:44:22.120] So that capability was so strong.
[00:44:22.440 --> 00:44:24.920] We're also seeing this shrinking thing.
[00:44:25.560 --> 00:44:33.320] It's funny, the last week I've seen companies that have been around for five years who, once LLMs came, they just reimagined their product.
[00:44:33.320 --> 00:44:37.880] Like Clay.ai is a six-year-old company, super successful.
[00:44:38.120 --> 00:44:45.720] So in the first category is like things like lovable, but in the second category of people who are like, okay, we're going to rethink this and we're going to do it.
[00:44:46.040 --> 00:44:52.760] Now, the problem for the rest of the industry is this, you know, the innovators dilemma.
[00:44:52.760 --> 00:44:55.800] They have lots of revenue tied to old code bases.
[00:44:55.800 --> 00:45:06.920] And so their job is to, and you know, we tell PM leaders all the time, is to navigate how you use LLMs on the old thing, but navigate to the new thing before someone eats your lunch.
[00:45:06.920 --> 00:45:09.800] And it's not, it's complicated, it's hard, but you have to do it.
[00:45:09.800 --> 00:45:12.840] Now, let me make maybe a couple of smaller things.
[00:45:13.800 --> 00:45:17.480] The people who are succeeding are taking chances on user experiences.
[00:45:17.480 --> 00:45:23.640] We don't believe the chat interface is the final boss for AI user experience.
[00:45:23.640 --> 00:45:31.400] We think that there's a reason UIs exist and the command line didn't work, otherwise, DOS will be the biggest operating system.
[00:45:31.640 --> 00:45:36.040] And so people have to do different things and be dynamic.
[00:45:36.080 --> 00:45:40.520] You know, one of the big things is dynamic user experiences that just personalize to the customer.
[00:45:40.680 --> 00:45:44.000] So we see a lot of people succeeding there.
[00:45:43.640 --> 00:45:51.120] And one of the things that we don't see people doing actually that we want people to do is something as mentioned, which is ethics.
[00:45:53.040 --> 00:46:05.120] We think of AI and its core capabilities as ordnance level, meaning like at some point we'll think of this as more powerful than the fusion, the fusion bomb.
[00:46:05.440 --> 00:46:13.360] And we have a bunch of software people who have no consideration of ethics play with this stuff all the time.
[00:46:13.360 --> 00:46:18.160] And so one thing we'd like to see the best companies do is lead on that.
[00:46:18.160 --> 00:46:24.800] Like think about the responsibility we have to the human race when we build new digital products or we give them superpowers.
[00:46:24.800 --> 00:46:27.680] I want to zoom out even further.
[00:46:28.000 --> 00:46:33.440] So we started talking about individual people, skills, what's going to be most important.
[00:46:33.440 --> 00:46:40.880] We've been talking just now about what companies are doing well, the ones that are succeeding in this crazy world of AI.
[00:46:40.880 --> 00:46:43.600] I want to now zoom out just your entire product career.
[00:46:43.600 --> 00:46:58.720] I'm curious just when you look back at the 50 plus years of combining experience you two have, what are some of the biggest product lessons that you've learned that you would love to share that you think people can avoid having to learn the hard way?
[00:46:58.720 --> 00:47:03.040] I think what I want to reiterate the thing that we talked about, sharp problems.
[00:47:04.000 --> 00:47:07.600] The problem you focus on is probably the most predictive of your success.
[00:47:07.600 --> 00:47:08.800] Pick a sharp problem.
[00:47:08.800 --> 00:47:13.520] Pick something that if you really solve it, people will come.
[00:47:14.400 --> 00:47:18.160] Watching Jeff Bezos from the 80s feels prescient.
[00:47:18.160 --> 00:47:20.400] He's like, look, people always shop.
[00:47:20.400 --> 00:47:21.840] I'm going to be the guy.
[00:47:22.640 --> 00:47:23.760] And he did it.
[00:47:23.760 --> 00:47:25.600] So do that.
[00:47:25.920 --> 00:47:35.640] And I say this humbly because I've failed, because sometimes we build things that we are passionate about, but being a founder is not an exercise.
[00:47:35.640 --> 00:47:40.280] And even an operator in a company, it's not an exercise in your passion and your brilliance.
[00:47:40.280 --> 00:47:42.920] It's an exercise in finding what the market needs.
[00:47:42.920 --> 00:47:43.640] So do that.
[00:47:43.640 --> 00:47:45.400] Be disciplined and doing that.
[00:47:45.400 --> 00:47:47.720] We also talk about simplicity a lot.
[00:47:48.040 --> 00:47:53.400] Even this morning, I was talking to a designer and he had a convoluted user experience.
[00:47:53.400 --> 00:47:58.040] And I said, only 20% of people who come to this will see that.
[00:47:58.040 --> 00:48:04.840] And so the point of design is first simplicity and clarity before we get fancy.
[00:48:04.840 --> 00:48:06.600] And so there's a lot.
[00:48:06.600 --> 00:48:09.400] Steve Jobs embodied this fairly on.
[00:48:09.400 --> 00:48:11.960] I think iOS has become complicated.
[00:48:11.960 --> 00:48:14.760] But like, start with simplicity.
[00:48:14.760 --> 00:48:16.840] And simple is hard.
[00:48:18.040 --> 00:48:21.560] And in 2025, we're designing for distracted brains.
[00:48:21.880 --> 00:48:23.000] A couple of things are happening.
[00:48:23.000 --> 00:48:24.600] People are super distracted.
[00:48:24.600 --> 00:48:26.440] People are no longer wild by technology.
[00:48:26.440 --> 00:48:28.200] You call it AI all you want.
[00:48:28.200 --> 00:48:30.760] They just assume that it is what it is.
[00:48:30.760 --> 00:48:35.960] And so simplicity is really good for 2025 and beyond, basically.
[00:48:37.480 --> 00:48:40.360] And then, you know, we talk a lot about fundamentals.
[00:48:40.360 --> 00:48:41.800] I'm going to throw this to you, is in there.
[00:48:41.800 --> 00:48:45.160] Like, there are ways to figure out the fundamentals.
[00:48:45.160 --> 00:48:51.160] What we do, I know we operate in the software layer, but it's most basic.
[00:48:51.160 --> 00:48:56.280] We're trying to predict what humans would do, what they will want, how their workflow will go.
[00:48:56.280 --> 00:49:09.080] And there's some, you know, there's a layer of that that is quite similar, you know, because our friends, the anthropologists and the psychologists, really understand this.
[00:49:09.080 --> 00:49:11.480] And I think if we understand it, we can build better software.
[00:49:11.480 --> 00:49:17.200] So, but isn't Izan, I want to let you, I'm just going to throw this up to you so I don't take up all the space.
[00:49:14.680 --> 00:49:17.680] Go for it.
[00:49:17.920 --> 00:49:38.480] No, I think you're on to something about just so when I think about the hard lessons I've learned, you talked about simplicity, and I'm saying this also from humility and just having learned a lot from past failures.
[00:49:38.800 --> 00:49:49.040] One of the reasons people create complicated solutions is because they are afraid to take a stab, to put their point of view.
[00:49:49.200 --> 00:49:50.720] Opinion out there.
[00:49:50.800 --> 00:49:51.280] Not opinion.
[00:49:52.000 --> 00:49:54.000] They're not opinionated enough.
[00:49:54.000 --> 00:50:01.280] And that often comes from not having high conviction because you're not sure that you've spent the time with the customer.
[00:50:01.280 --> 00:50:10.480] So, with that, I know it sounds trite, but spend time and understand the customer or borrow from others who do.
[00:50:10.480 --> 00:50:19.440] But you've got to not just lay it all out and have them choose which way because it is so effing confusing.
[00:50:19.440 --> 00:50:22.160] 10,000 options because you can't pick one.
[00:50:22.160 --> 00:50:26.080] Yeah, the experience is so murky, chalky.
[00:50:26.080 --> 00:50:30.640] Nobody knows what the hell to do because you can't make a decision.
[00:50:30.640 --> 00:50:31.920] Make it simple.
[00:50:32.320 --> 00:50:35.440] Come up with an opinion and put your opinion out there.
[00:50:35.440 --> 00:50:36.880] You can change.
[00:50:36.880 --> 00:50:39.840] Leave the configurations and the options behind.
[00:50:39.840 --> 00:50:41.760] But leave them behind the scenes if you must.
[00:50:41.760 --> 00:50:48.400] But try and create the most compelling, the simplest experience, and you will actually have better adoption.
[00:50:48.400 --> 00:50:51.200] You're better off doing that and being wrong.
[00:50:51.200 --> 00:51:03.080] Many times we leave and ship convoluted experiences because we don't have the courage, number one, to make a decision, create a point of view, and ship it out there.
[00:51:03.400 --> 00:51:14.120] The reality is, and my experience has shown that you're better off picking an opinion, shipping it out there, being wrong, and then adjusting.
[00:51:14.120 --> 00:51:20.920] that leaving too many options because you then can never learn what was the better experience overall.
[00:51:20.920 --> 00:51:23.160] So that's one thing.
[00:51:23.160 --> 00:51:29.720] The other major lesson I've learned is actually one about strategy.
[00:51:29.720 --> 00:51:41.240] It is that the best way to go from strategy to execution in a way that activates the entire organization is communication.
[00:51:41.480 --> 00:51:53.640] You will never, never spend too much time communicating the why to your organization over and over and over again.
[00:51:53.880 --> 00:52:02.920] I don't remember what movie it was, but I think it was the movie about sending someone to the moon, right?
[00:52:05.480 --> 00:52:06.760] I'm forgetting what it is.
[00:52:06.760 --> 00:52:12.840] But the idea that everybody in NASA knew that we were working to send a man to the moon.
[00:52:12.840 --> 00:52:15.480] Even the janitor did, knew that.
[00:52:15.480 --> 00:52:21.960] So that idea of ensuring that everybody in the team understands the why behind what we're doing and can speak to it.
[00:52:21.960 --> 00:52:24.040] I saw something on LinkedIn the other day.
[00:52:24.040 --> 00:52:28.280] I think it was Martin Erickson who actually pushed up, posted this.
[00:52:28.280 --> 00:52:42.520] He talked about how the same way we talk about crossing the chasm and figuring out who it is that will adopt a product, we should think about the change management required for strategy communication through that lens.
[00:52:42.520 --> 00:52:44.800] There will be early adopters.
[00:52:44.360 --> 00:52:45.920] Think about those people.
[00:52:46.240 --> 00:52:57.360] Those people will hear the message, the strategy, and by the time others are just beginning to understand what it is, they're probably on the next version of the strategy, asking what can change, what should change.
[00:52:57.360 --> 00:53:04.000] And that was so clarifying for me because I always wondered why strategy could get muddy sometimes.
[00:53:04.000 --> 00:53:09.040] It's because when you look at your entire population, about 5% are going to be the early adopters.
[00:53:09.040 --> 00:53:12.960] They get it, they buy into it, and they're moving.
[00:53:12.960 --> 00:53:17.440] Four months later, they're onto something wondering what the next version of the strategy will look like.
[00:53:17.440 --> 00:53:19.200] How do we build on top of this?
[00:53:19.200 --> 00:53:23.280] At that same time, some people are just beginning to get it.
[00:53:23.600 --> 00:53:36.480] And honestly, that answered a lot of questions for me as to how it is that you can be working on a strategy, some people get it, can recite it, tell you why we're doing what we're doing, while some PMs are still struggling with, well, why?
[00:53:36.800 --> 00:53:37.600] How come?
[00:53:37.600 --> 00:53:38.560] Can we do this?
[00:53:38.560 --> 00:53:41.440] Because they are, in fact, laggards out there.
[00:53:41.440 --> 00:53:56.000] So, that idea of trying to really activate a strategy into pure execution, if you think about communication as a critical element, and then also look at it as how many people, what is the crossing the chasm version?
[00:53:56.000 --> 00:53:59.360] Who is at what point in this change management journey?
[00:53:59.360 --> 00:54:05.440] I think that is actually, that was one of the major lessons I've learned as a leader in product.
[00:54:05.440 --> 00:54:14.320] I love these sorts of conversations where you two spend 50 years learning and then you just tell us, here's all the answers, here's a bunch of stuff that'll save you a bunch of pain and suffering.
[00:54:14.320 --> 00:54:17.520] Let me try to summarize what you shared and tell me what I missed.
[00:54:17.520 --> 00:54:30.000] So, some of the biggest things you two have learned over the past 50 years, and it's not in the right order, but communicating the why, making sure that you can never spend too much time helping people understand why what you're working on is important.
[00:54:30.600 --> 00:54:37.960] And then there's just the change management component of not underestimating the work that it'll take to convince people to adopt something.
[00:54:37.960 --> 00:54:39.960] And then, this whole idea of crossing the chasm.
[00:54:39.960 --> 00:54:44.440] We had Jeffrey Moore on the podcast, so we'll point to that if you want to deeply understand what that's all about.
[00:54:44.440 --> 00:54:57.640] Then, this lesson of simplicity, just keep per just the value of keeping things simple, easier said than done, and having an opinion, having the courage almost to do something really simple, not give people all the options.
[00:54:57.640 --> 00:55:05.320] And then, this Aji, you meant you talked about the idea of most times when something fails, it's the wrong, you're going after the wrong problem.
[00:55:05.320 --> 00:55:10.280] It's not sharp enough, or it's just not solving people, something people actually care about.
[00:55:10.600 --> 00:55:16.840] And then, just talking to customers, always easy to say, most people don't actually do it enough.
[00:55:16.840 --> 00:55:19.160] I want to add one more thing, and I'll be brief.
[00:55:19.160 --> 00:55:23.320] This is more about people's careers, like PMs.
[00:55:23.320 --> 00:55:36.280] In our career, we have, you know, we started out as engineers basically, became mainline PMs, got an MBA, became, you know, worked harder, became executives, and so on and so forth.
[00:55:36.280 --> 00:55:40.600] So, some, you know, people look at my career as an escrow and they're like, oh, this is amazing.
[00:55:40.600 --> 00:55:44.920] But behind that is a lot of intention.
[00:55:44.920 --> 00:55:51.240] Like, we always held in our brains what do we want to do next?
[00:55:51.240 --> 00:55:54.040] Like, what is the next step?
[00:55:54.040 --> 00:55:57.000] What is the hunger that drives us?
[00:55:57.000 --> 00:56:03.000] And there's nothing more powerful than intention, than like imagination.
[00:56:03.000 --> 00:56:08.840] It's like, you know, a little bit of cheese in front of the mouse, like, it chases after.
[00:56:08.840 --> 00:56:16.880] Being able to visualize and chase the thing that you see about where you want to be is so powerful.
[00:56:14.840 --> 00:56:19.680] I talked to a guy who gave me a break.
[00:56:19.920 --> 00:56:27.120] So when I was, you know, the last two years of my time at Microsoft, I switched to become a marketing lead.
[00:56:28.080 --> 00:56:32.640] I was a little tired of PM at Microsoft specifically, and I was getting an MBA.
[00:56:32.960 --> 00:56:34.720] And I talked to Dave Mendelin.
[00:56:34.880 --> 00:56:42.240] He was vice president, you know, or senior director in the, I worked, you know, eventually for Satya.
[00:56:42.560 --> 00:56:46.320] And he gave me a break and became a product marketer for a couple of years.
[00:56:46.320 --> 00:56:56.880] But he came back and talked to me this year and he said, OG, everything you told me you wanted to do from afar, I saw that you have done.
[00:56:57.200 --> 00:56:59.520] And you told me about them 10 years ago.
[00:56:59.520 --> 00:57:01.200] And I didn't even know that.
[00:57:01.200 --> 00:57:03.040] And so it's a very powerful thing.
[00:57:03.040 --> 00:57:08.000] I think that this is not just about how to build better products, but how to manage your own career is super important.
[00:57:08.000 --> 00:57:12.320] I just went to a talk, storytelling event, and there was a very similar theme.
[00:57:12.320 --> 00:57:15.280] This guy just always wanted to be a stand-up comedian.
[00:57:15.280 --> 00:57:20.960] And he, the lesson he learned, is just you need to, the best motivator is desire.
[00:57:22.080 --> 00:57:26.960] Having that desire not fade, because that'll, it's kind of like going back to our idea of the project.
[00:57:26.960 --> 00:57:33.200] Having a project to work on because you need a problem solved is a really good way to just get to where you want to go.
[00:57:33.200 --> 00:57:33.680] Yeah.
[00:57:33.920 --> 00:57:36.240] Ezina, I think you had something else you wanted to add.
[00:57:36.240 --> 00:57:40.560] Yeah, you know, you said this thing about, you know, know the customer, know the customer.
[00:57:40.560 --> 00:57:47.120] And I think there's a lot of info, there's a lot of noise out there about customer interviews, et cetera, right?
[00:57:47.120 --> 00:57:52.720] But I do want to remind people that there's a difference between what the customer says, right?
[00:57:52.720 --> 00:57:55.920] Customer discovery, asking them things, right?
[00:57:56.240 --> 00:57:58.400] And AI is really good at that right now.
[00:57:58.400 --> 00:58:00.760] Like, you can help you get all the transcriptions.
[00:57:59.920 --> 00:58:03.720] But just remember that what they do is actually more important.
[00:58:04.520 --> 00:58:18.920] So earlier in my career, I had the opportunity to have a UX lab that I worked in, and ethnographic research, watching what people truly were doing, still is top-notch.
[00:58:18.920 --> 00:58:20.920] It's the best thing you can ever do.
[00:58:20.920 --> 00:58:23.640] So, what does that look like digitally?
[00:58:23.640 --> 00:58:36.200] It's still the core instrumentation, figuring out how people finish things, but also true customer insight comes from not using AI to read all the transcripts of your customer interviews.
[00:58:36.200 --> 00:58:37.960] That isn't what it is.
[00:58:37.960 --> 00:58:41.960] It's actually trying to figure out how to best walk in their shoes.
[00:58:41.960 --> 00:58:47.160] And whatever it takes, you've got to figure out how to do that because it's not what they say they do.
[00:58:47.160 --> 00:58:48.520] It's not what they say they want.
[00:58:48.920 --> 00:58:52.040] That's not the true intimacy you want with a customer.
[00:58:52.040 --> 00:58:59.320] You actually want to observe them as much as possible and understand the why behind the actions they're taking.
[00:58:59.320 --> 00:59:07.320] And I think that's a really important thing to point out right now because I think people are cheating with AI through interviews, et cetera, and thinking that they're going to figure it out.
[00:59:07.320 --> 00:59:19.640] But there's a different type of insight that comes from what I call true ethnographic research, studying, observing, understanding the driver behind their need, their problem or the need that they're trying to solve.
[00:59:20.120 --> 00:59:34.040] And I just want to make sure we say that because I think people think AI has come to save it all, but no, it's going to give you junk because people don't always say what they mean or they don't realize that they're lying about their reasons for doing a thing.
[00:59:34.040 --> 00:59:39.560] I think some of these lessons, especially the last one, is the kind of lesson that you can hear it and be like, yeah, yeah, I get it.
[00:59:39.560 --> 00:59:43.080] And then you have to actually learn it and find the times.
[00:59:43.080 --> 00:59:43.880] Oh, wow, okay.
[00:59:43.880 --> 00:59:46.320] Everyone told me they wanted this thing, and nobody actually used it.
[00:59:46.320 --> 00:59:48.800] Okay, I remember as Ine said that on a podcast.
[00:59:44.920 --> 00:59:50.160] Oh, I get it now.
[00:59:51.040 --> 00:59:54.960] And so I think, you know, there's only so much we can do to help people learn these things.
[00:59:54.960 --> 00:59:58.720] Sometimes you just have to fail to learn.
[00:59:59.360 --> 01:00:03.120] Okay, is there anything that we have not touched on?
[01:00:03.120 --> 01:00:09.120] Anything that you wanted to share before we get to our very exciting lightning round?
[01:00:09.120 --> 01:00:10.640] I think we need to talk about ethics.
[01:00:10.640 --> 01:00:26.640] And as people of color and parents, I think that having a call to action for our PMs to really be thoughtful about what it is they're building and what it is they're integrating into their product is really important to us.
[01:00:26.640 --> 01:00:36.480] We just want to call PMs that this awesome power we have to direct lots of developers comes with responsibility, the Spider-Man trope.
[01:00:36.480 --> 01:00:39.120] And, you know, we created social media.
[01:00:39.120 --> 01:00:40.160] I worked at Twitter.
[01:00:40.160 --> 01:00:43.440] And I always like, oh my God, like, how many mistakes do we make?
[01:00:43.440 --> 01:00:50.880] Like, I remember the person who invented DeepFates was interviewed one time and she said, and they're like, what are the implications of your stuff?
[01:00:50.880 --> 01:00:52.880] And she's like, I don't care.
[01:00:53.200 --> 01:00:55.120] And I was so shocked.
[01:00:55.120 --> 01:00:55.760] I was like, what?
[01:00:55.920 --> 01:00:57.440] Software engineer, you don't care?
[01:00:57.440 --> 01:00:59.280] She's like, oh, it's all evented countermeasure.
[01:00:59.280 --> 01:01:00.320] I'm like, oh, my God.
[01:01:00.640 --> 01:01:02.880] So, PMs, we shouldn't be like that.
[01:01:03.520 --> 01:01:04.560] We shouldn't be like that.
[01:01:04.720 --> 01:01:09.200] I guess the one thing I also wanted to say is like, one pain point we hear a lot is strategy.
[01:01:09.600 --> 01:01:15.840] PMs who are like senior PM, principal PM, worry about, oh my God, how can I do product strategy?
[01:01:16.160 --> 01:01:20.800] And in the book, we talk a lot about the sources of strategy.
[01:01:20.800 --> 01:01:23.840] And it's not mysterious.
[01:01:23.840 --> 01:01:34.280] I think there's things you need to learn about, like the sources of competitive advantage and even what that means, like intellectual property or like economies of scale, economies of scope.
[01:01:34.600 --> 01:01:39.640] Because there's no formal education about this stuff, PMs don't have the fundamentals they need.
[01:01:39.640 --> 01:01:54.600] Because it's easy to learn customer discovery, but like learning the seven levers of competitive advantage, maybe you've never seen, or learning the 10 ways that companies can grow, which is not just product, sometimes it's distribution, sometimes a superior business strategy.
[01:01:54.600 --> 01:02:12.600] And these are ideas that if PMs had good material, it's sort of like packed in the middle of business school, you can actually become a person who helps direct your company on where to go and do it in such a way that's so compelling you can talk about it that you know it's good for your career.
[01:02:12.600 --> 01:02:18.280] I remember Elaine, I think one time I saw maybe a long time ago, I saw something you're written around how do you make money?
[01:02:18.280 --> 01:02:32.520] You know, you sell a thing, you rent a thing, you know, there many people think that it's there are many, many, like it's this amorphous thing, but you may not have the truly exhaustive list, right?
[01:02:32.520 --> 01:02:43.400] But there are a fixed number of things that you can know to create strategy to make money and just doing the work or doing the research to figure out, okay, hey, how does one make money?
[01:02:43.400 --> 01:02:45.480] How does one get competitive advantage?
[01:02:45.480 --> 01:02:50.600] In the book, we wrote about 10 growth levers and seven levels of competitive advantage.
[01:02:50.600 --> 01:03:01.720] And I think if you master that, you have like 80% of your tool set that makes you a compelling director or CPO that can play at that level.
[01:03:02.120 --> 01:03:03.160] And it's super important.
[01:03:03.160 --> 01:03:04.360] Otherwise, you're sort of guessing.
[01:03:04.360 --> 01:03:06.920] You're like, oh, did this deck work for my CEO?
[01:03:06.920 --> 01:03:08.760] No, that's not what you should think about it.
[01:03:08.920 --> 01:03:21.840] I'll point to the post that you're mentioning, Ezine, that is all the ways to make money, but let's use this opportunity to talk about your book, just what people would get out of it, why they should buy it, do the pitch, and then we'll get to the very exciting lightning round.
[01:03:22.160 --> 01:03:31.040] In our career, 50 plus years, when we started this product management thing, we thought, oh, this is going to be mainstream.
[01:03:31.040 --> 01:03:31.840] There's development.
[01:03:31.840 --> 01:03:33.840] There are 30 million developers, whatever.
[01:03:33.840 --> 01:03:36.640] And there's going to be product managers all over the world.
[01:03:36.640 --> 01:03:48.720] But operating in the US, operating in Europe, in Africa, it turns out the best PMs are so ghettoed somewhere in the west coast of the United States, right?
[01:03:48.720 --> 01:03:49.920] We know this.
[01:03:49.920 --> 01:03:51.840] And we don't like that at all, right?
[01:03:51.840 --> 01:03:53.360] We think opportunity should be everywhere.
[01:03:53.360 --> 01:04:06.320] So we wrote Building Rocket Ships to help people figure out how to build not only great products, but the companies that build great products to make sure that people have the knowledge not to be one-hit wonders.
[01:04:06.320 --> 01:04:10.560] And the book is divided into two: one is the fundamentals.
[01:04:10.560 --> 01:04:15.840] If you're a senior to principal PM, the fundamentals of building a great product, right?
[01:04:15.840 --> 01:04:18.720] All the way from simplicity to even pricing.
[01:04:19.040 --> 01:04:24.320] And then the second part is how do you lead a high-performing shipyard?
[01:04:24.320 --> 01:04:40.720] And how do you go from motivating people to charting the course for the future, depending on where you're in the market, to make, you know, like Ezena said, to commanding a shipyard that is kick-ass.
[01:04:41.200 --> 01:04:42.640] That's the book.
[01:04:42.640 --> 01:04:50.080] And, you know, in the latter part, we also talk about what's going to happen in terms of AI over the next few years.
[01:04:50.480 --> 01:04:53.200] So, we are, it's called Building Rocket Ships.
[01:04:53.200 --> 01:04:54.000] We're very proud of it.
[01:04:54.000 --> 01:04:55.520] It's right over here.
[01:04:55.520 --> 01:05:03.160] And we think it's awesome because it represents a lot of the experience that we've had over these years and a lot of the success.
[01:05:03.480 --> 01:05:05.560] And where do folks find it if they want to go check it out?
[01:05:05.560 --> 01:05:08.520] Maybe you can buy one or two or ten or hundred.
[01:05:08.520 --> 01:05:11.480] Some companies have been buying a lot actually in bulk.
[01:05:11.480 --> 01:05:13.240] So, this is a note for the companies.
[01:05:13.240 --> 01:05:15.320] But you can buy it on Amazon.
[01:05:15.560 --> 01:05:17.320] You can find it on Shopify.
[01:05:17.320 --> 01:05:22.680] On Shopify, if you go for it, there's an additional Pro Edition.
[01:05:22.680 --> 01:05:24.840] The Pro Edition is like a team product.
[01:05:25.000 --> 01:05:30.040] The book is on coda.io, and you can share it with your team.
[01:05:30.040 --> 01:05:31.000] You can take notes.
[01:05:31.000 --> 01:05:42.920] There's almost double the amount of content because it contains checklists, templates, tools for PMs that they can download and just use immediately.
[01:05:42.920 --> 01:05:46.840] It's almost like code for you, for your PM career.
[01:05:46.840 --> 01:05:50.680] We have the book, and then we have the Coda version.
[01:05:50.680 --> 01:05:54.200] And one of the things we really worked on was making it actionable.
[01:05:54.200 --> 01:06:07.160] So the minute you get the Coda version, you can read a chapter and then you'll have templates, frameworks, step checklists, whatever it is that we felt makes sense.
[01:06:07.160 --> 01:06:09.320] We made that available via the coda version.
[01:06:09.320 --> 01:06:11.080] What a motherload of value.
[01:06:11.880 --> 01:06:15.240] People always have issues saying the URL on podcasts.
[01:06:15.240 --> 01:06:16.200] I just want to make sure what's that?
[01:06:16.200 --> 01:06:17.400] How do you actually find this version?
[01:06:17.400 --> 01:06:20.600] You said it's on Shopify, but there's probably a domain name to get there.
[01:06:20.600 --> 01:06:33.800] So if you go to www.productmind.co slash brpro, you can buy the Pro Edition.
[01:06:34.120 --> 01:06:41.160] And if you've bought it, you can also use it to log into the coda edition, and you're off to the races.
[01:06:41.160 --> 01:06:41.800] There we go.
[01:06:41.800 --> 01:06:42.280] Amazing.
[01:06:42.280 --> 01:06:44.200] We'll also link to it in the show notes.
[01:06:44.200 --> 01:06:47.760] And with that, we've reached our very exciting lightning round.
[01:06:44.680 --> 01:06:49.440] I've got five questions for you, too.
[01:06:49.440 --> 01:06:50.320] Are you ready?
[01:06:50.640 --> 01:06:51.280] Yes.
[01:06:51.280 --> 01:06:55.760] First question is: what are two or three books you find yourself recommending most to other people?
[01:06:55.760 --> 01:06:56.800] We just did one.
[01:06:56.800 --> 01:06:57.680] Building Rocket Ships.
[01:06:57.760 --> 01:07:01.760] Building Rocket Ships is the first one, so we're not afraid to self-promote.
[01:07:01.760 --> 01:07:04.000] But also, the other one for me is Build.
[01:07:04.000 --> 01:07:05.200] Tony Fidel.
[01:07:05.200 --> 01:07:12.160] You know, I ran into Tony in Nigeria building an accelerator, and it was just incredible book.
[01:07:12.160 --> 01:07:13.840] So I totally recommend it.
[01:07:13.840 --> 01:07:17.520] One for me, and it's both for work and life, etc.
[01:07:17.840 --> 01:07:19.920] This one called The Let Them Theory.
[01:07:19.920 --> 01:07:21.360] I think it was Mel Robbins.
[01:07:21.520 --> 01:07:22.160] I love it.
[01:07:22.160 --> 01:07:34.080] I just think that many times PMs are control freaks or A type and being able to leave people to their own devices sometimes is hard.
[01:07:34.080 --> 01:07:40.640] So let them remind us that we get a lot more rest for ourselves when we let people be.
[01:07:40.960 --> 01:07:42.880] I was just listening to that book on audio.
[01:07:42.880 --> 01:07:46.480] I was like, it's sitting at the top of the bestseller like every freaking week.
[01:07:46.480 --> 01:07:48.000] What is this book?
[01:07:48.320 --> 01:07:48.960] And it's so good.
[01:07:48.960 --> 01:07:50.240] I sat on your background actually.
[01:07:50.240 --> 01:07:51.520] I noticed that you're reading it.
[01:07:51.760 --> 01:07:52.640] Yeah, it's so good.
[01:07:52.640 --> 01:07:53.680] It's such a simple idea.
[01:07:53.840 --> 01:07:56.640] And the thing I learned at listening to it is there's a second part to it.
[01:07:56.640 --> 01:07:57.680] It isn't just let them.
[01:07:57.680 --> 01:07:58.000] I don't know.
[01:07:58.160 --> 01:07:58.400] Let them.
[01:07:59.440 --> 01:08:00.080] Yeah, I did.
[01:08:00.080 --> 01:08:01.200] It's about let me.
[01:08:01.200 --> 01:08:01.760] Yeah, exactly.
[01:08:01.920 --> 01:08:02.240] Let me.
[01:08:02.480 --> 01:08:02.880] Such a guy.
[01:08:02.960 --> 01:08:03.600] I've been using it.
[01:08:03.600 --> 01:08:04.640] It's really powerful.
[01:08:04.640 --> 01:08:05.200] I get why.
[01:08:06.240 --> 01:08:08.800] I didn't, I mean, the book is good.
[01:08:08.800 --> 01:08:10.000] Yeah, let's give her credit.
[01:08:10.000 --> 01:08:14.000] But it's such a simple thing that other people in many religions, et cetera, have used.
[01:08:14.000 --> 01:08:25.520] But she grounds, she grounds, she shows you why it's necessary and how we show up trying to control others in other situations that are very common to most people.
[01:08:25.520 --> 01:08:26.800] So I like that.
[01:08:26.800 --> 01:08:28.400] I never thought of it in the PM context.
[01:08:28.400 --> 01:08:29.240] That is very cool.
[01:08:29.040 --> 01:08:31.000] And that's by Mel Robinson, by the way.
[01:08:31.320 --> 01:08:35.960] Okay, next question: Is there a favorite recent movie or TV show that you really enjoyed?
[01:08:35.960 --> 01:08:38.680] So we watched a few shows together recently.
[01:08:38.680 --> 01:08:45.080] So I would say one that we loved, and we couldn't, we kept saying, gosh, we can't wait for our kids to be old enough to watch.
[01:08:45.080 --> 01:08:47.560] It was one called Forever.
[01:08:48.600 --> 01:08:51.160] It was just really, really good.
[01:08:53.000 --> 01:08:54.200] It's a love story.
[01:08:54.200 --> 01:08:56.920] It's one of these books.
[01:08:57.000 --> 01:09:01.400] It was a book reimagined, done from an African-American point of view.
[01:09:01.400 --> 01:09:03.240] It was just really well done.
[01:09:03.240 --> 01:09:04.600] So that's a TV show.
[01:09:04.600 --> 01:09:06.280] It's called Forever.
[01:09:06.280 --> 01:09:09.720] And another one that we really like, because I think it was Sterling K.
[01:09:09.800 --> 01:09:11.560] Brown is Paradise.
[01:09:11.720 --> 01:09:12.840] Really enjoyed it.
[01:09:12.840 --> 01:09:14.120] It was interesting.
[01:09:14.920 --> 01:09:15.320] Excellent.
[01:09:15.640 --> 01:09:16.200] It kept us.
[01:09:16.520 --> 01:09:17.000] Yeah.
[01:09:17.000 --> 01:09:17.960] Adju, what about you?
[01:09:18.120 --> 01:09:18.920] Mine is on a roll.
[01:09:18.920 --> 01:09:23.240] And the movie that we've been geeking out on is Sinners.
[01:09:23.880 --> 01:09:25.240] We just love Sinners.
[01:09:25.240 --> 01:09:26.200] I think I saw it twice.
[01:09:26.200 --> 01:09:29.240] My son saw it like three or four times.
[01:09:30.040 --> 01:09:33.000] We're just big fans of Koogler and we tell stories.
[01:09:33.000 --> 01:09:34.360] So we love that.
[01:09:34.760 --> 01:09:40.120] I've seen both those two recently, and they both have very unexpected, they've been unexpected directions.
[01:09:40.120 --> 01:09:41.640] I'm like, why did I see this coming?
[01:09:41.960 --> 01:09:42.840] Yes.
[01:09:42.840 --> 01:09:43.640] Yes.
[01:09:43.640 --> 01:09:44.200] Oh, man.
[01:09:44.200 --> 01:09:45.800] Yeah, that movie is beautiful, Sinners.
[01:09:45.800 --> 01:09:46.600] I was like, wow.
[01:09:46.920 --> 01:09:48.600] But I usually hate scary movies.
[01:09:48.600 --> 01:09:49.240] I enjoyed that one.
[01:09:49.240 --> 01:09:52.520] Someone else recently mentioned that movie too, actually, on the podcast.
[01:09:52.520 --> 01:09:56.040] Next question: Is there a favorite product you recently discovered that you really love?
[01:09:56.040 --> 01:09:59.400] Could be a gadget, could be clothes, could be a kitchen device, could be an app.
[01:09:59.400 --> 01:09:59.800] I'll start.
[01:09:59.880 --> 01:10:01.240] I mean, I love Claude.
[01:10:01.240 --> 01:10:03.320] It's like Adi says, it's my sidekick.
[01:10:03.320 --> 01:10:06.360] I use it for anything and everything.
[01:10:07.640 --> 01:10:14.400] But obviously, the thing you learn is: don't, even though you keep talking about it as generative AI, be the generator and let it refine.
[01:10:14.400 --> 01:10:15.600] That's the major trick.
[01:10:15.840 --> 01:10:17.120] That's the hack everybody should learn.
[01:10:17.280 --> 01:10:17.920] Super hack.
[01:10:14.280 --> 01:10:18.560] Don't generate.
[01:10:19.120 --> 01:10:22.320] Generate first and let it refine and let it help you.
[01:10:22.320 --> 01:10:24.240] Otherwise, you're going to get bullshit.
[01:10:24.240 --> 01:10:24.560] Sorry.
[01:10:24.560 --> 01:10:25.760] That's a really good tip.
[01:10:26.400 --> 01:10:32.320] And one, I'll say I have this new Nespresso machine, the Virtuo, that Oji got from me.
[01:10:32.640 --> 01:10:37.120] Everybody is using it now, and I'm kind of pissed, but it's still my favorite product.
[01:10:37.120 --> 01:10:38.800] But everybody's in it.
[01:10:39.200 --> 01:10:42.960] The reason is, and now we like keeping it really clean and shiny.
[01:10:42.960 --> 01:10:46.400] And now that everybody's using it, it's not as clean and shiny as it used to be.
[01:10:46.560 --> 01:10:48.960] When she says everybody, she means everyone else in the house.
[01:10:48.960 --> 01:10:50.320] She wants to use it by herself.
[01:10:50.640 --> 01:10:51.120] Let's be clear.
[01:10:51.360 --> 01:10:53.520] She needs a second version just for herself.
[01:10:53.520 --> 01:10:55.360] Yeah, we need another one.
[01:10:55.360 --> 01:10:58.960] And then the plebs and the pluritorate can use the other ones.
[01:10:58.960 --> 01:10:59.280] That's right.
[01:10:59.280 --> 01:11:00.000] That's what it is.
[01:11:00.400 --> 01:11:00.880] When they're good.
[01:11:00.880 --> 01:11:01.360] When they're good.
[01:11:01.600 --> 01:11:04.000] Make all my lattes, hot, cold, all of that stuff.
[01:11:04.000 --> 01:11:04.480] I love it.
[01:11:04.480 --> 01:11:06.080] Aji, what about you?
[01:11:06.080 --> 01:11:09.280] I just like using these new AI tools that improve my life.
[01:11:09.440 --> 01:11:10.960] Gamma is good.
[01:11:10.960 --> 01:11:13.760] Framer is also good.
[01:11:14.000 --> 01:11:18.960] Get into Lovable because I sort of like say using the hardcore stuff.
[01:11:18.960 --> 01:11:30.320] And so using Lovable felt like a little step down it a little bit, but like Olama, LM Studio, these are ways that I used to do my hobby of improving the house.
[01:11:30.800 --> 01:11:35.360] But also, I'm super into local models.
[01:11:35.680 --> 01:11:40.800] These open source models, running them, because I feel like I can put them in different places.
[01:11:41.040 --> 01:11:44.000] So LM Studio and Olama are just favorite things.
[01:11:44.560 --> 01:11:47.200] So yeah, those are some of the things I'm vibing on.
[01:11:47.200 --> 01:11:50.960] Gamma and Lovable get a year free by becoming a paid subscriber of my newsletter.
[01:11:51.120 --> 01:11:52.000] We'll link to that.
[01:11:52.240 --> 01:11:53.040] Little did you know.
[01:11:53.680 --> 01:11:54.800] I heard about Lovable.
[01:11:54.800 --> 01:11:56.000] I didn't know about Gamma.
[01:11:56.480 --> 01:11:56.960] Absolutely.
[01:11:57.920 --> 01:11:59.280] Maybe Framer someday.
[01:11:59.280 --> 01:12:00.360] There's a lot going on there.
[01:12:00.360 --> 01:12:00.760] Check it out.
[01:12:00.760 --> 01:12:01.800] Lenny'snewsletter.com.
[01:12:01.800 --> 01:12:03.000] Click product pass.
[01:12:03.000 --> 01:12:03.400] Okay.
[01:11:59.920 --> 01:12:04.440] Two more questions.
[01:12:04.760 --> 01:12:11.160] Is there a favorite life motto that you two find really useful and come back to in work or in life?
[01:12:11.160 --> 01:12:12.840] So I'll give you, I'll say two.
[01:12:12.840 --> 01:12:14.760] And one has been from childhood.
[01:12:14.760 --> 01:12:20.200] And my mother had a little plaque in our house that said, anything worth doing is worth doing well.
[01:12:20.520 --> 01:12:21.560] And it's so funny.
[01:12:21.560 --> 01:12:27.720] I just, I assume that that's how everybody lives, but I realize now that it isn't.
[01:12:27.720 --> 01:12:30.520] So I come back to that all the time.
[01:12:30.840 --> 01:12:33.400] And then, so I'll say that came from my mother.
[01:12:33.400 --> 01:12:41.400] But the one I think I've adopted and probably just general upbringing is this thing that anywhere you are, make better.
[01:12:41.400 --> 01:12:46.840] So it's like by being there, I've made it better.
[01:12:46.840 --> 01:12:51.240] And it's simple things like you walk into a bathroom, it's nasty, try and clean up if you can't.
[01:12:51.240 --> 01:12:52.920] It's that simple.
[01:12:52.920 --> 01:12:53.480] Beautiful.
[01:12:55.240 --> 01:12:58.760] But it's something I live with and I live by.
[01:13:00.600 --> 01:13:02.440] Audi, what you got?
[01:13:03.080 --> 01:13:07.720] You know, I mentioned one already, which is whenever you wake up is your morning.
[01:13:07.720 --> 01:13:20.680] So like I tried, I try to spend very little time looking backwards, except to mind the lessons of it, and then just move forward because the universe and the forward motion is infinite.
[01:13:20.680 --> 01:13:24.360] And so it's an evil proverb: ogei teta wototogei.
[01:13:24.360 --> 01:13:25.960] So I want to say that.
[01:13:26.680 --> 01:13:38.760] But the other thing I think I learned from my dad is just like learn all the time, but doesn't mean that you should be not confident.
[01:13:38.920 --> 01:13:44.280] You know, it's this idea that there's more knowledge outside your brain than inside it.
[01:13:44.280 --> 01:13:46.160] And that doesn't mean you should be in a crouch.
[01:13:46.160 --> 01:13:53.680] In fact, you should be proud of being called stupid for a second because we're all stupid at something, right?
[01:13:53.680 --> 01:13:55.120] In the sense that we don't know.
[01:13:55.120 --> 01:13:57.680] There's so many things we don't know because we're finite.
[01:13:57.680 --> 01:14:04.480] But do that and still be confident so you can learn but still have a sense of self on what you've accomplished.
[01:14:04.480 --> 01:14:06.400] And so I try to live by that every day.
[01:14:07.040 --> 01:14:10.160] I love that you each have two and they're both so wonderful.
[01:14:11.360 --> 01:14:15.760] Unfortunately, final question: something that I like to do with duo guests.
[01:14:16.480 --> 01:14:23.840] I switch between two types of questions, so I'm going to ask you this one: What's something you love about the other person?
[01:14:24.160 --> 01:14:32.640] I love that Oji lives in the future, and as obviously, I get the benefit of this.
[01:14:33.280 --> 01:14:36.160] We get to plan.
[01:14:36.400 --> 01:14:51.360] He's very intentional about the future, and it just makes us move together with intention for as partners in life, as well as even with our company.
[01:14:51.360 --> 01:14:55.280] I think that is something I really appreciate about him.
[01:14:55.840 --> 01:14:57.360] So sweet.
[01:14:59.600 --> 01:15:02.080] So, there are two things I love about Izina.
[01:15:02.080 --> 01:15:07.200] One is she is, I think she's a very accepting, loving person.
[01:15:07.200 --> 01:15:07.760] She's nice.
[01:15:07.760 --> 01:15:14.720] Like, when I introduce her, I say, Look, you're gonna talk to her and you're gonna forget about me because she's 10 times nicer than I am.
[01:15:15.280 --> 01:15:18.720] I'm not a horrible person, but she's just way nicer than I am.
[01:15:18.720 --> 01:15:20.320] And so, that's that's great.
[01:15:20.320 --> 01:15:26.720] The second thing is something that people don't know about her is that she's a great problem solver, right?
[01:15:26.720 --> 01:15:34.120] Like, she will, you'll throw something at her and the asymmetric thinking is incredible, right?
[01:15:29.680 --> 01:15:36.520] I didn't think of that.
[01:15:36.840 --> 01:15:38.200] It happens to me all the time.
[01:15:38.200 --> 01:15:39.720] So I love that too.
[01:15:40.040 --> 01:15:43.000] Ezini and Aji, this was amazing.
[01:15:43.000 --> 01:15:48.840] We covered so much ground, so many lessons learned over these past 50 years about life and work and PMing and all the things.
[01:15:48.840 --> 01:15:49.960] Two final questions.
[01:15:49.960 --> 01:15:51.960] Where can folks find you online if they want to reach out?
[01:15:51.960 --> 01:15:54.360] And how can listeners be useful to you?
[01:15:54.360 --> 01:15:58.520] They can find us online at www.productmind.co.
[01:15:58.520 --> 01:15:59.480] That's first.
[01:15:59.480 --> 01:16:05.000] That's where we help founders and help companies build better products.
[01:16:05.160 --> 01:16:10.600] They can find us on Substack where we publish as ProductMind.
[01:16:12.120 --> 01:16:13.560] I think those are the main places.
[01:16:13.800 --> 01:16:21.960] We are starting to put content out in YouTube or, you know, just experimenting with how to amplify our voice.
[01:16:21.960 --> 01:16:24.120] But those are the two main places you can find us.
[01:16:24.120 --> 01:16:24.520] Awesome.
[01:16:24.520 --> 01:16:26.040] And we'll link to all that stuff in the show notes.
[01:16:26.040 --> 01:16:27.080] And then, yeah, Ezini.
[01:16:27.240 --> 01:16:28.200] No, same thing.
[01:16:28.200 --> 01:16:31.800] Like our LinkedIn, we always say my DMs are always open.
[01:16:31.800 --> 01:16:34.280] That's what I always tell people at work.
[01:16:34.920 --> 01:16:36.120] You can reach out to us.
[01:16:36.360 --> 01:16:39.000] We're pretty responsive to most people.
[01:16:39.480 --> 01:16:43.800] But yeah, productmind.co is the best way to get a hold of us.
[01:16:43.800 --> 01:16:49.320] How can we be useful to us is like we are so inspired by what you've done with your community.
[01:16:49.640 --> 01:17:01.560] And we're part of a few communities, but we are very interested in partnering with more PMs, especially executive cohort, like who are having a hard time right now.
[01:17:01.560 --> 01:17:13.880] Like partnering with us, calling us, like we want to be able to distribute opportunity that we may not be interested in to PMs who are struggling right now.
[01:17:13.880 --> 01:18:19.280] And sometimes it's not obvious where to pass opportunity that we're not interested in so like i think useful to us is reach out um and there might be intersections that help you or help us so do it amazing and that's through dms on linkedin ideally or through your website yes through our website or through substack or just linked okay all the places yeah incredible uh thank you two so much for being here such a pleasure thank you lenny um we just like what you're doing for our community is incredible so uh hats off i appreciate it hats off good to see the progress over over the years so yeah well done thank you thank you you too okay bye everyone thank you so much for listening if you found this valuable you can subscribe to the show on apple podcasts spotify or your favorite podcast app also please consider giving us a rating or leaving a review as that really helps other listeners find the podcast you can find all past episodes or learn more about the show at lennyspodcast.com see you in the next episode