
Episode 792 | Hot Take Tuesday: GPT-5 Struggles, the A.I. Bubble, and the Windsurf Debacle
September 2, 2025
Key Takeaways
- TinySeed’s first fund has returned more capital to investors than was initially invested, placing it in the top 10% or 5% of funds from its vintage, which is a significant achievement in venture capital given the long timelines for fund returns.
- The perceived decline in GPT-5’s performance compared to GPT-4 suggests that AI model improvement may be reaching an asymptote, with smaller incremental gains requiring exponentially more compute, challenging the notion of rapid, unchecked AI advancement.
- The Windsurf situation highlights a potential breach of Silicon Valley’s social contract, where early employees received minimal value from a significant company arrangement while founders and investors benefited substantially, raising concerns about equity and employee compensation in startup exits.
Segments
GPT-5 Performance Debate (00:09:40)
- Key Takeaway: The perceived limitations and mixed reception of GPT-5 suggest that AI model improvements may be reaching an asymptote, with smaller incremental gains requiring exponentially more compute, challenging the "foom" (rapid, exponential AI takeoff) narrative.
- Summary: The conversation delves into the user experience and perceived performance of GPT-5, with guests noting that while it’s an improvement, the gains are not as dramatic as some might expect. They discuss the debate around whether GPT-5 is better than GPT-4, the user interface changes, and the underlying economics of AI model development, suggesting a potential slowdown in exponential capability growth.
AI Bubble and Business Viability (00:19:19)
- Key Takeaway: Many AI-centric startups may be unviable long-term due to the high cost of AI compute, which is often subsidized by LLM providers, creating a potential bubble that could burst as costs rise or free credits diminish.
- Summary: The hosts explore the concept of an AI bubble, focusing on the economic model of AI compute. They discuss how companies like OpenAI and Anthropic are losing money by providing AI services at a cost lower than their production, and how this unsustainable model could impact startups built on these services, especially those offering seemingly low-cost AI-driven features.
Windsurf Employee Compensation (00:24:09)
- Key Takeaway: The Windsurf situation exemplifies a breach of Silicon Valley’s social contract, where early employees received minimal financial benefit from a significant company arrangement while founders and select individuals secured substantial payouts, raising concerns about equity and employee compensation.
- Summary: This segment addresses the controversial Windsurf situation, where early employees reportedly received very little from the company’s acquisition by Google and subsequent sale. The discussion highlights the perceived violation of the traditional Silicon Valley promise of shared upside for early employees, contrasting it with the significant financial gains for founders and investors, and questioning the fairness of such arrangements.
Selling to Non-Users (00:34:42)
- Key Takeaway: For SaaS products where the user is not the buyer (e.g., developers using a tool for a CTO), founders must proactively equip users with the tools and information needed to advocate for the product internally, thereby navigating complex sales cycles.
- Summary: The discussion centers on the challenges of selling B2B SaaS when the end-user is different from the decision-maker. The article and hosts emphasize the importance of building product features and resources that empower the user to champion the product to their leadership, acknowledging that this requires a deeper understanding of the entire sales cycle and strategic thinking beyond just product development.
Transcript
[00:00:00.160 --> 00:00:02.800] You've stumbled upon another episode of Startups for the Rest of Us.
[00:00:02.800 --> 00:00:16.800] I'm Rob Walling, and in this Hot Take Tuesday, I welcome back Einar Vollset and Tracy Osborne as we talk about GPT-5 and the struggles they've had with that release, whether we're in an AI bubble, and a few other relevant news items.
[00:00:16.800 --> 00:00:19.600] One of them actually deeply relevant to SaaS sales.
[00:00:19.600 --> 00:00:25.760] Before we dive into the episode, I wanted to let you know that TinySeed applications are open for the fall 2025 batch.
[00:00:25.760 --> 00:00:37.040] If you're a B2B SaaS founder with at least $1,000 in MRR and you're looking for the right amount of funding, a community of ambitious, like-minded founders, and a network of world-class mentors, you should apply.
[00:00:37.040 --> 00:00:45.200] If you have any questions about the program or the application process, we're also doing a live Q&A with the TinySeed team this Wednesday, September 3rd, on YouTube.
[00:00:45.200 --> 00:00:46.960] We'll link that up in the show notes.
[00:00:46.960 --> 00:00:51.360] If you know your metrics, the application only takes about 10 or 15 minutes to complete.
[00:00:51.360 --> 00:00:53.840] Applications close on September 9th.
[00:00:53.840 --> 00:00:57.760] You can get all the details at tinyseed.com/apply.
[00:00:57.760 --> 00:00:59.520] All right, let's get into it.
[00:01:06.560 --> 00:01:10.320] Welcome back to Hot Take Tuesday.
[00:01:12.560 --> 00:01:17.280] I have two guests, my recurring panel, and this is by popular demand.
[00:01:17.280 --> 00:01:23.440] And by that, I mean two people have mentioned it to me in the past six weeks saying, why don't you do Hot Take Tuesday anymore?
[00:01:23.440 --> 00:01:24.880] And I was like, I don't know.
[00:01:24.880 --> 00:01:25.840] People have been traveling.
[00:01:25.840 --> 00:01:26.640] It's been summer.
[00:01:26.640 --> 00:01:27.760] So here we are.
[00:01:27.760 --> 00:01:34.960] We're going to cover stories ranging from, well, you know, doesn't everyone want to hear us talk about American politics and crypto?
[00:01:34.960 --> 00:01:36.640] No, we're not going to do either of those things.
[00:01:36.640 --> 00:01:38.400] We're going to discuss Kirkhop.
[00:01:38.720 --> 00:01:39.280] He did.
[00:01:39.280 --> 00:01:41.120] He's like, I want to talk about both.
[00:01:41.120 --> 00:01:43.840] GPT-5, is it worse than 4o?
[00:01:43.840 --> 00:01:48.560] Are we entering an AI bubble with the limits that folks are starting to put on it?
[00:01:48.560 --> 00:01:54.080] Windsurf, and that whole debacle of employees getting hosed and more topics.
[00:01:54.080 --> 00:02:00.440] First, I'd like to introduce my panelists, the head of product for TinySeed and MicroConf.
[00:02:00.440 --> 00:02:03.000] Tracy Osborne.
[00:02:03.960 --> 00:02:05.240] That was awful.
[00:02:08.840 --> 00:02:10.520] Do you want to do that again?
[00:02:11.800 --> 00:02:13.320] Welcome to the show.
[00:02:13.320 --> 00:02:18.360] I was going to say, I was wondering if you realized I've had a title change since the last time I was here.
[00:02:18.360 --> 00:02:19.640] It's been that long.
[00:02:20.520 --> 00:02:21.560] It's been a while.
[00:02:21.560 --> 00:02:25.720] You used to be Tracy Makes on Twitter, and that's where we'll leave it.
[00:02:25.720 --> 00:02:27.480] No, wait, you're at Tracy Osborne.
[00:02:27.560 --> 00:02:27.880] Where are you?
[00:02:27.880 --> 00:02:30.040] TracyOsborne.com.
[00:02:30.200 --> 00:02:30.840] I am.
[00:02:30.840 --> 00:02:31.400] Okay.
[00:02:31.800 --> 00:02:36.920] And we're still counting down the days until you owe Einar $1,000 and buy us both sushi dinner.
[00:02:36.920 --> 00:02:37.720] That was in there, right?
[00:02:37.720 --> 00:02:38.680] Wasn't it a sushi dinner?
[00:02:39.160 --> 00:02:43.400] We never determined what all the terms were between Twitter and between.
[00:02:43.480 --> 00:02:48.440] What was it, three years? And I think it was $100 million.
[00:02:48.840 --> 00:02:50.360] Sorry, that wasn't even active.
[00:02:50.920 --> 00:02:52.440] It was like 50 or 100, and we kind of bounced.
[00:02:52.440 --> 00:02:53.080] I think it's 100%.
[00:02:53.160 --> 00:02:54.760] Elon Musk has a lot to go.
[00:02:54.760 --> 00:02:56.680] So I sort of remain hopeful.
[00:02:57.240 --> 00:03:09.080] The other person chiming in is our very own Einar Vollset, co-founder of TinySeed, the founder of Discretion Capital, the best sell-side M&A firm for SaaS between $2 and $20 million ARR.
[00:03:09.080 --> 00:03:10.920] Einar Vollset, welcome to the show.
[00:03:10.920 --> 00:03:12.200] Thanks for having me.
[00:03:12.200 --> 00:03:14.040] All right, let's dive in to our first story.
[00:03:14.040 --> 00:03:15.720] This is actually story zero.
[00:03:15.720 --> 00:03:23.480] And we've been talking about this for the past couple weeks and getting a lot of positive responses, as one would expect, on X/Twitter and the like.
[00:03:23.480 --> 00:03:35.080] TinySeed, our startup accelerator for ambitious B2B SaaS founders, has returned fund one, meaning we have returned more capital to our investors, our LPs, than they invested.
[00:03:35.080 --> 00:03:36.440] This is a six-year-old fund.
[00:03:36.440 --> 00:03:46.400] And Einar, I believe, based on benchmarks across venture capital funds, we are in the top 10%, potentially top 5%, of funds of that vintage.
[00:03:46.400 --> 00:03:49.600] So, you want to talk, you know, why is this a big deal?
[00:03:44.840 --> 00:03:52.080] Because I've never run a fund before, and I was like, oh, that's neat.
[00:03:52.240 --> 00:03:53.520] And then I'm like, no, that's not neat.
[00:03:53.520 --> 00:03:54.880] Like, this is a big deal.
[00:03:54.880 --> 00:03:55.840] Why is that?
[00:03:55.840 --> 00:03:56.640] It's a big deal.
[00:03:56.640 --> 00:04:01.680] I mean, I think it puts us in the top 10%, if not top 5%, of that vintage.
[00:04:01.680 --> 00:04:04.480] And it's not just like, oh, that was a really crappy vintage.
[00:04:04.480 --> 00:04:14.640] The fact of the matter is that venture funds have gotten to the point where, partly because tech companies stay private for longer, that sort of thing,
[00:04:14.640 --> 00:04:23.280] they take longer and longer before they return any funding, and certainly before they start returning the fund and multiples on that.
[00:04:23.280 --> 00:04:32.560] So actually, I just saw there was a stat out from Carta, and DPI is the term: distributions to paid-in.
[00:04:32.560 --> 00:04:37.680] And it basically says that over 12 years in, only about half the funds have returned 1x DPI.
[00:04:37.680 --> 00:04:47.360] So what that means is if you, if you invested in all the venture funds out there, you know, and you held it for 12 years, only about half those funds would even have returned your capital.
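The DPI metric described here is simple arithmetic; a minimal sketch, with all cash-flow figures invented for illustration (not TinySeed's actual numbers):

```python
# DPI (distributions to paid-in): capital returned to LPs divided by
# capital they paid in. A fund is "returned" once DPI >= 1.0x.
# All figures below are hypothetical.

def dpi(distributions, paid_in):
    """Total distributions to LPs / total capital called from LPs."""
    return sum(distributions) / sum(paid_in)

paid_in = [2.0, 3.0, 5.0]       # capital calls over the fund's life ($M)
distributions = [4.0, 7.0]      # proceeds returned from exits ($M)

print(f"DPI = {dpi(distributions, paid_in):.2f}x")  # DPI = 1.10x
```

Per the stat quoted above, roughly half of venture funds are still below 1.0x on this measure even 12 years in.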
[00:04:47.360 --> 00:04:49.280] So yeah, it's a big deal.
[00:04:49.280 --> 00:04:56.720] And also, I think crucially for us, it's not just like, you know, you can return the fund and then have nothing left, in which case, like, yeah, that wasn't so great.
[00:04:56.720 --> 00:05:03.520] But the fact is, like, for us, for fund one, you know, we still have, you know, the majority of the assets are still growing.
[00:05:03.520 --> 00:05:06.880] And sort of it's a very good sign very early on.
[00:05:06.880 --> 00:05:14.800] VC is kind of funny in the sense that it's one career where people don't really know, and you don't really know, whether you're any good until 10 years in.
[00:05:15.760 --> 00:05:18.880] So it's a good place for charlatans and fast talkers.
[00:05:18.880 --> 00:05:21.120] So we'll see how, we'll see if that includes us.
[00:05:21.120 --> 00:05:22.880] But yeah, it's a very good sign.
[00:05:22.880 --> 00:05:24.880] And yeah, I feel very good about it.
[00:05:24.880 --> 00:05:29.360] And also, crucially, like, look, we return, you know, the money of the people who trusted us the first time around.
[00:05:29.360 --> 00:05:30.440] Like, this was fund one.
[00:05:30.760 --> 00:05:32.760] Like, it was an unproven thing.
[00:05:29.840 --> 00:05:36.840] You know, a lot of people called this really stupid, like, it's not going to work out.
[00:05:37.080 --> 00:05:39.240] Like, you're never even going to get your money back.
[00:05:39.240 --> 00:05:48.760] And at this point, we've proven them wrong, and we can, you know, tell those people to go themselves, and give the people who trusted us more than the money they put in.
[00:05:48.760 --> 00:05:51.800] They've already gotten a return on their capital with more to come.
[00:05:51.800 --> 00:05:52.920] So that feels good.
[00:05:52.920 --> 00:05:54.280] Yeah, really exciting.
[00:05:54.280 --> 00:05:55.240] Tracy?
[00:05:55.560 --> 00:06:06.840] The thing I didn't realize before I started working here, I, you know, because before I was a startup founder, I tried, I raised some money from VCs, but I didn't really quite understand how VC economics worked.
[00:06:06.840 --> 00:06:18.440] And the return of the fund is super exciting because compared to startups where you're like, you're doing super well and you can reinvest those profits into the business and continue growing, we have to continually raise new funds.
[00:06:18.440 --> 00:06:39.960] So if we have this massive success on our first fund, that proves out our thesis, proves that what we're trying to do is working, and working really well. That means we can raise more funds. And we need to raise more funds in order to exist, because the returns aren't coming back to TinySeed for us to invest in other people.
[00:06:39.960 --> 00:06:41.800] We have to raise new funds in order to do that.
[00:06:41.800 --> 00:06:47.320] And then we live off those, you know, run the company essentially off those management fees.
[00:06:47.320 --> 00:06:54.680] So on one hand, it's like it's super awesome to return that fund, but it's also for me, it's like, yay, I get to have a job for longer because of this, you know?
[00:06:55.320 --> 00:06:56.440] I like that part.
[00:06:56.760 --> 00:06:58.120] It's also pretty strange.
[00:06:58.200 --> 00:07:01.160] It's like, look, investing is like this.
[00:07:01.160 --> 00:07:08.120] You take people's money and you're basically saying, I know better than you what to do with your money.
[00:07:08.120 --> 00:07:13.080] You should give your money to me instead of doing whatever you want to do with it yourself.
[00:07:13.080 --> 00:07:20.560] And like being able to prove that out with actual cash in the bank instead of like arguing about markups and paper returns and all this stuff feels really good.
[00:07:20.880 --> 00:07:22.880] That's the thing because we came out with a thesis.
[00:07:22.880 --> 00:07:31.440] A, that bootstrapped B2B SaaS founders would even take money, because there was a whole conversation around this in 2017 and 2018 as I started talking about this at MicroConf.
[00:07:31.520 --> 00:07:35.120] There was a real pushback from a small group, but like, why would they take your money?
[00:07:35.120 --> 00:07:35.920] Why would they give up equity?
[00:07:35.920 --> 00:07:36.720] They should just bootstrap.
[00:07:36.720 --> 00:07:37.760] So there was that thesis.
[00:07:37.760 --> 00:07:41.920] Then there's the thesis of: does this even return close to the S&P 500, right?
[00:07:41.920 --> 00:07:49.040] Because if it doesn't return that over, you know, what, 10 years or whatever the horizon is, no one's going to give us more money and it's not viable.
[00:07:49.040 --> 00:07:50.640] And so those were our theses.
[00:07:50.640 --> 00:07:55.360] And obviously the first one has been proven because we funded 204 companies in the past six years.
[00:07:55.360 --> 00:07:57.520] And the second one, the signs are good.
[00:07:57.520 --> 00:08:01.680] We've returned the fund and as you said, more than the initial investment.
[00:08:01.680 --> 00:08:05.120] So folks are already getting a return and there's still a lot of upside.
[00:08:05.120 --> 00:08:10.240] With that in mind, fund three, tiny seed fund three is closing soon.
[00:08:10.240 --> 00:08:11.600] We have a little bit more room.
[00:08:11.600 --> 00:08:25.760] And if you are an accredited investor anywhere in the world and would like to put some money to work in hand-picked, high-quality, I would say some of the best deal flow in B2B SaaS in the world, head to tinyseed.com/invest.
[00:08:25.760 --> 00:08:26.880] And you can read our thesis.
[00:08:26.880 --> 00:08:30.320] And if you fill out that form there, it goes directly to Einar's inbox.
[00:08:30.320 --> 00:08:31.520] And he will reach out.
[00:08:31.520 --> 00:08:38.480] Before we move on from this topic, Einar, the last post I put on X/Twitter got a question from Daniel Westendorf.
[00:08:38.480 --> 00:08:41.200] And he asked, are you able to share the distribution of returns?
[00:08:41.200 --> 00:08:45.280] Like, was the majority from a few portfolio exits or something else?
[00:08:45.280 --> 00:08:50.480] And obviously, we don't give exact numbers because we don't need to because we're a private fund, but what's your sentiment around this?
[00:08:50.800 --> 00:08:52.720] Yeah, the short answer is yes.
[00:08:53.040 --> 00:08:58.480] It's exits or partial exits from one or two or a small number of the portfolio.
[00:08:58.480 --> 00:09:06.920] And that's very much in line with the thesis that we wrote ahead of, actually, Fund 2, which is: look, this is still a power-law-type environment.
[00:09:07.160 --> 00:09:12.360] It's not like, you know, all the companies are going to grow uniformly at the same rate.
[00:09:12.360 --> 00:09:18.760] There's still going to be some companies that outperform to the point where, yeah, it's obvious that this is what drives performance.
[00:09:18.760 --> 00:09:21.800] And that's sort of been proven out for us as well.
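The power-law point can be made concrete with a toy sketch (the exit multiples below are invented for illustration, not TinySeed portfolio data):

```python
# Toy power-law portfolio: most companies return little, one outlier
# dominates. Multiples are hypothetical returns on equal-sized checks.

exit_multiples = [0.0, 0.0, 0.5, 1.0, 1.0, 2.0, 3.0, 40.0]

total = sum(exit_multiples)   # 47.5x across eight checks
best = max(exit_multiples)    # the single outlier
print(f"portfolio multiple: {total / len(exit_multiples):.1f}x")
print(f"outlier's share of returns: {best / total:.0%}")
```

One company out of eight drives the large majority of the return, which is why exits from a small number of portfolio companies can return a whole fund.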
[00:09:22.120 --> 00:09:26.440] And if anyone's curious about what this all means, we do have this on our website.
[00:09:26.440 --> 00:09:28.680] It's at tinyseed.com/thesis.
[00:09:28.680 --> 00:09:32.680] So if you are a data nerd, definitely check that out because I think it's really interesting.
[00:09:32.680 --> 00:09:40.440] Again, coming from someone coming from the startup world, coming into VC, seeing how these things kind of work behind the scenes, check out our thesis if you want to read more.
[00:09:40.440 --> 00:09:49.560] All right, moving on to our next story, something that's on a lot of folks' minds right now is GPT-5 came out, what, two weeks ago?
[00:09:49.560 --> 00:09:50.840] And there is a lot of talk.
[00:09:50.840 --> 00:09:59.880] I mean, we have a Y Combinator thread, a Hacker News thread, that we'll put in the show notes, but you can go almost anywhere on the internet and just search for "is GPT-5 worse than 4o."
[00:09:59.880 --> 00:10:02.840] And it does seem like there's a little bit of friction online.
[00:10:02.840 --> 00:10:10.600] Tracy, do you have thoughts you want to kick us off on whether you've, you know, have you have first-hand experience with it or what's your sentiment around what's going on?
[00:10:10.600 --> 00:10:15.560] I think it's, I'm definitely not a power user, but it was really interesting yesterday.
[00:10:15.560 --> 00:10:21.400] Well, I guess in the last week, because 5 launched and you couldn't choose what model it was using.
[00:10:21.400 --> 00:10:29.000] It was choosing in the back end, you know, whether it's going to be using the slower, more expensive, whatever, robot to answer your questions or the fast, quick one.
[00:10:29.000 --> 00:10:32.120] And then just yesterday, I was using it again, and they added those two things.
[00:10:32.120 --> 00:10:39.400] I was divorced from the discussion on the internet about which one's better, but I was like, oh, thank goodness I can finally choose.
[00:10:39.400 --> 00:10:40.360] I don't want it to.
[00:10:40.360 --> 00:10:43.160] For me, I didn't want it to choose, and I wanted to pick.
[00:10:43.160 --> 00:10:49.120] And then I found this whole wash of information of people screaming about different models and like which one was the better one.
[00:10:49.200 --> 00:10:57.920] And then people, it's understandable if you're not able to choose which one, whether you want it to be fast or slow, or using the expensive cycles or not.
[00:10:57.920 --> 00:11:04.240] Because if you're on a paid account, you know, there's this expectation of having the most expensive version.
[00:11:04.240 --> 00:11:11.920] So I can kind of see why they like monkey patch this back in, this like the thinking, I guess they call it thinking or fast or whatever the other two are.
[00:11:11.920 --> 00:11:15.920] Anyway, that's my experience with just kind of seeing this in the last week, being like, oh, goodness.
[00:11:15.920 --> 00:11:19.680] And then finding this whole conversation online and being like, whoa, it's huge here.
[00:11:19.680 --> 00:11:21.680] I am sure Anar has things to say though.
[00:11:21.680 --> 00:11:24.080] So I think I should pass it over to him.
[00:11:24.080 --> 00:11:25.920] Yeah, I mean, yeah, I have opinions.
[00:11:26.240 --> 00:11:28.000] There's a couple of different things here, right?
[00:11:28.000 --> 00:11:35.280] Like, you know, is GPT-5 better than o4 or 4o, or whatever the name is?
[00:11:35.440 --> 00:11:36.400] I mean, I think it is.
[00:11:36.800 --> 00:11:37.920] I use it a fair bit.
[00:11:37.920 --> 00:11:42.800] And I think there's no doubt in my mind that, you know, GPT-5 is very good.
[00:11:42.800 --> 00:11:58.480] I think one of the more vindicating things for me is something I've been saying for a while, actually in the talk that I was supposed to give at MicroConf in New Orleans, the one that Tracy so expertly gave. One of the points I make is, look, and this was about: will AI kill SaaS?
[00:11:59.120 --> 00:12:03.440] Which has been a big question for the last couple of years since ChatGPT came out.
[00:12:03.440 --> 00:12:27.960] It's like, look, everyone says it's going to get so smart, so soon, that you're never even going to need any SaaS; you'll just talk to it, it's superintelligence, there's all this stuff. And my point has been: if you look at the data on how quickly it's improving, and how much it's improving per step, it seems to me to be doing the opposite of what all these AI doomers are talking about.
[00:12:27.960 --> 00:14:00.560] All the AI doomers talk about this idea of "foom," this exponential takeoff where it'll soon be so smart that we'll be like ants to it. And I'm like, okay, but that's not what the data shows. The data shows it looks to be reaching some kind of asymptote. That asymptote could be very high, but it still seems to me that each new model that comes out, whether it's from ChatGPT or Google or Grok or whatever, is an improvement, but a smaller step up than the last one. And I think GPT-5 just confirms that view for me. Look, this is an extremely disruptive technology. There's a lot here that is disruptive; it's different, it's capable in a way that's obviously very new. But do I think we're going to be ants to it in two years? Do I think we need to bomb data centers because otherwise humanity has ended because of ChatGPT-7? No. That's bullshit. It always was bullshit. If you looked at the data when it came out, this was crap, honestly. Some really smart people basically told me I was an idiot when I pointed this out. But there are some fundamental scaling laws underneath this capability growth that mean that as you make another step in capability, the amount of compute you need to add to make it smarter grows exponentially.
[00:14:00.560 --> 00:14:06.160] So you get a linear increase in capability with an exponential growth in compute, and that just doesn't work.
[00:14:06.160 --> 00:14:10.800] Like, you need a data center the size of the moon to get to the next step just by scaling.
[00:14:10.800 --> 00:14:15.200] And beyond that, it's like, ah, now it's a data center the size of the galaxy.
[00:14:15.200 --> 00:14:18.560] Come on, it just doesn't work that way.
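The "linear capability, exponential compute" claim above can be sketched as a toy model (the 10x-per-step multiplier is an assumption for illustration, not a measured scaling law):

```python
# Toy model: each +1 step of capability costs a fixed multiple (here 10x)
# more compute than the previous step, so compute grows exponentially
# while capability grows only linearly.

BASE_COMPUTE = 1.0       # arbitrary units for the first model
STEP_MULTIPLIER = 10.0   # assumed cost ratio between steps, for illustration

def compute_for_step(step):
    """Compute required to reach a given capability step."""
    return BASE_COMPUTE * STEP_MULTIPLIER ** step

for step in range(6):
    print(f"capability +{step}: {compute_for_step(step):>9,.0f}x compute")
# By step 5 you need 100,000x the original compute for only +5 capability.
```

Under any multiplier greater than 1, the compute bill compounds while capability ticks up one step at a time, which is the asymptote argument in miniature.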
[00:14:18.560 --> 00:14:31.600] And so, for our listeners, to a degree it must be sort of a relief to see that GPT-5 isn't superhuman, isn't going to reproduce any SaaS at any time with the flick of a button for 20 bucks a month.
[00:14:31.600 --> 00:14:36.560] And so it's more like, okay, this is an extremely capable technology that, yeah, you need to stay on top of.
[00:14:36.560 --> 00:14:38.960] Like, you need to incorporate it into your SaaS business.
[00:14:38.960 --> 00:14:42.400] You need to incorporate it into your business, or you will be left behind.
[00:14:42.400 --> 00:14:44.240] But you don't need to become a plumber.
[00:14:44.240 --> 00:14:49.040] I mean, nothing wrong with plumbers, but like, it's not like technology as a job is over.
[00:14:49.040 --> 00:14:50.480] So, so, yeah, that's what I think.
[00:14:50.800 --> 00:15:04.880] When Einar talked about the asymptote, whenever it was, a year ago or six months or something, that resonated with me already, because the first ChatGPT that I used feels pretty similar to the one that I use today.
[00:15:05.200 --> 00:15:10.240] You know, it's a little better, obviously, but it's improving in a small way.
[00:15:10.240 --> 00:15:16.560] And I broke my own personal rule, which is never read the Hacker News comments, because there's no article here.
[00:15:16.560 --> 00:15:20.640] It's just "ChatGPT-5 is slow and no better than 4," and it asks a question, right?
[00:15:20.640 --> 00:15:26.880] But I did like that the first comment says GPT-5 is better as a conversational thinking partner than 4o.
[00:15:26.880 --> 00:15:30.560] Its answers are more focused, concise, and informative, and it goes on.
[00:15:30.560 --> 00:15:33.040] And probably, you know, that's probably accurate.
[00:15:33.040 --> 00:15:41.920] The thing that it reminds me of is, you know, we're thinking that ChatGPT, the new model, is going to be able to do everything better than all the old models.
[00:15:41.920 --> 00:15:44.400] And that's the difference is I think we're getting to this point.
[00:15:44.400 --> 00:15:49.280] I'm probably not the first person to say this, but where there's going to start being GPTs just for this.
[00:15:49.280 --> 00:15:51.360] Like if it's like a brainstorming GPT, right?
[00:15:51.360 --> 00:15:57.920] And it's optimized for that versus like long form copy or short form copy or conversation or coding or, right?
[00:15:57.920 --> 00:15:59.880] There's all these uses people are using it for.
[00:15:59.880 --> 00:16:03.960] And if it was AGI, then maybe it could do all these things, but it's not.
[00:15:59.520 --> 00:16:06.440] And I don't see a near future where that is.
[00:16:06.600 --> 00:16:14.680] So I think the specialization of AI, of these models, already exists, obviously, but I don't know that most people think about it that way.
[00:16:14.680 --> 00:16:16.360] They just think: new model, better at everything.
[00:16:16.360 --> 00:16:18.120] And I don't think that's going to happen.
[00:16:18.120 --> 00:16:19.160] I think that's definitely true.
[00:16:19.160 --> 00:16:25.960] I mean, like, it's to the point where like when GPT-5 came out, so I pay for the pro level 200 bucks a month, whatever.
[00:16:25.960 --> 00:16:31.560] And when it came out, I was genuinely a little sad that I couldn't talk to 4.5.
[00:16:31.800 --> 00:16:35.720] That's still my favorite. It's almost like it has personality, some of it.
[00:16:35.720 --> 00:16:37.160] And I really like 4.5.
[00:16:37.160 --> 00:16:41.160] And so when they discontinued 4.5, I was like, this is a bit right here.
[00:16:41.160 --> 00:16:45.960] Because like, yeah, in some ways, 5 is more capable in some of the tasks that I have it running on.
[00:16:45.960 --> 00:16:49.080] But 4.5 was just a better conversationalist.
[00:16:49.080 --> 00:16:49.720] It just was.
[00:16:49.720 --> 00:16:52.440] And so when they brought it back, I was like, yes, thank you.
[00:16:53.000 --> 00:16:54.360] I'm extremely happy to have it back.
[00:16:54.440 --> 00:16:55.000] And that's true.
[00:16:55.720 --> 00:16:57.240] I was a little sad when it disappeared.
[00:16:57.240 --> 00:16:58.360] And now I'm like, yeah, it's back.
[00:16:58.360 --> 00:16:59.560] It's amazing.
[00:16:59.560 --> 00:17:22.120] And then, I want to say 4.5 also had some amount of, I don't know if the right word for this is neutering, but it felt like it was pulled back a little bit. In 5, there are some topics it won't talk about, but I believe 5 hasn't been prevented from talking about certain topics as much as 4.5 was.
[00:17:22.120 --> 00:17:23.800] Have you run into that, Einar?
[00:17:24.600 --> 00:17:29.000] I thought 4.5 was pretty good, actually. One of the interesting things is they said, what was it?
[00:17:29.000 --> 00:17:35.080] Oh yeah, only pro users, at the $200-a-month level, are going to be able to use 4.5 from now on.
[00:17:35.080 --> 00:17:38.280] And because this apparently is an extremely expensive model.
[00:17:38.280 --> 00:17:41.720] And the nice thing about 4.5 is it doesn't have a thinking piece.
[00:17:41.720 --> 00:17:44.960] And it's like, when you're doing back and forth, I don't want you to be thinking.
[00:17:44.440 --> 00:17:48.960] Like, I don't want you to spend like a minute and a half thinking about the response.
[00:17:44.680 --> 00:17:50.240] I need you just to respond.
[00:17:50.720 --> 00:18:00.320] And to me, that seems to be like, it almost feels like 4.5 is like the, which I guess 5 is too, like the 5 fast, maybe, although that does seem a little brain dead compared to 4.5.
[00:18:00.320 --> 00:18:09.440] It's like, 4.5 seemed to me like where the current paradigm is able to go with pure scaling, without those kinds of thinking hacks on top of it.
[00:18:09.440 --> 00:18:11.040] That's what it feels like to me.
[00:18:11.840 --> 00:18:26.480] There was speculation on X/Twitter, maybe it wasn't speculation, maybe it was backed up with data, but it was kind of like: 5 is less good, but that's because it's cheaper and OpenAI is thinking about cost, and there were folks kind of piping in with that.
[00:18:26.480 --> 00:18:29.760] So maybe that's maybe that's part of it too.
[00:18:31.040 --> 00:18:34.480] Do you wish you had a technical co-founder to help bring your idea to market?
[00:18:34.480 --> 00:18:40.160] Gearheart specializes in helping early stage founders build products users actually want.
[00:18:40.160 --> 00:18:51.440] Their AI-powered approach tests assumptions, validates concepts with real users, and builds scalable MVPs with best practices that won't collapse when your first customers arrive.
[00:18:51.440 --> 00:18:56.800] Founded by entrepreneurs who've launched and exited their own startups, they understand the journey you're on.
[00:18:56.800 --> 00:19:05.280] For the last 13 years, they've helped launch over 70 B2B SaaS platforms, including SmartSuite, which raised $38 million.
[00:19:05.600 --> 00:19:15.280] Book your free strategy session at gearheart.io and mention this podcast to get 20% off validation, discovery, or prototyping for your early stage venture.
[00:19:15.280 --> 00:19:18.000] That's gearheart.io.
[00:19:19.280 --> 00:19:25.040] Our next story is kind of, I'm going to frame it as like, I don't know, are we in an AI bubble?
[00:19:25.040 --> 00:19:27.280] I think the answer is obviously yes, with the amount of money flowing in.
[00:19:27.280 --> 00:19:29.440] But the idea is, you know, what's it going to look like as it pops?
[00:19:29.440 --> 00:19:38.840] And we're going to have, I have a Reddit thread here, but this is another one where AI compute is essentially being given away for far less than it takes to produce, right?
[00:19:38.840 --> 00:19:49.640] OpenAI and Anthropic and all of these are losing hundreds of millions, billions of dollars, whatever it is, because they are buying something for a dollar and giving it away for 20 cents.
[00:19:49.640 --> 00:19:51.720] So there's a couple ways to think about this.
[00:19:51.720 --> 00:19:57.880] I mean, I know developers got pretty angry online when suddenly it was like, wait, they're limiting my... I don't even remember what service it was.
[00:19:57.880 --> 00:19:58.920] Was it Claude Code or something?
[00:19:58.920 --> 00:20:00.440] It was like, now I have limits on this thing.
[00:20:00.440 --> 00:20:05.560] And it's like, well, of course, because you're probably chewing up $500 worth of AI and you're paying 20 bucks a month.
[00:20:05.560 --> 00:20:06.840] Like, this doesn't make any sense.
[00:20:06.840 --> 00:20:08.600] So, you know, that's kind of a first topic.
[00:20:08.600 --> 00:20:24.680] But the other one is, it reminds me, there are a lot of wrappers and a lot of SaaS apps with AI at their core, where all they're doing is taking these virtually free, almost free compute credits and doing things with them that, if they were actually charged what it costs, would be completely ridiculous.
[00:20:24.680 --> 00:20:26.920] Like, you know, creating headshots for me, right?
[00:20:26.920 --> 00:20:33.000] Or creating thumbnails, or interior or exterior visualizations, any of these small image things or the conversion things.
[00:20:33.000 --> 00:20:39.560] I think if you actually had to charge what that really cost, these are unviable businesses, is my hypothesis.
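The hypothesis above is ultimately a unit-economics claim, and it can be sketched in a few lines of arithmetic. The prices and request counts below are hypothetical illustrations chosen to mirror the "$20-a-month plan, $500 of compute" scenario from the conversation, not figures cited in the episode:

```python
# Illustrative sketch of the subsidized-compute problem: a flat-rate AI
# product whose per-user inference cost can exceed the subscription price.
# All numbers are made up for illustration.

def monthly_margin(price: float, requests: int, cost_per_request: float) -> float:
    """Gross margin per user per month (negative means the user is subsidized)."""
    return price - requests * cost_per_request

# A heavy user on a $20/month plan issuing 10,000 requests at $0.05 each:
print(monthly_margin(20.0, 10_000, 0.05))  # -480.0: the provider eats $480
```

The viability question for a wrapper business is simply whether that number stays positive once the underlying provider stops subsidizing the cost per request.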
[00:20:39.560 --> 00:20:42.040] And that's what I want to throw out to the panel: if you have thoughts on that.
[00:20:42.040 --> 00:20:42.920] Let's start with you, Einar.
[00:20:42.920 --> 00:20:44.120] What are your thoughts here?
[00:20:44.120 --> 00:20:45.240] I think it's hard to tell.
[00:20:45.240 --> 00:21:05.080] It's one of those things where I think, like, actually, like, okay, yeah, it's expensive now, but again, I think like a good enough capability is probably achievable on like a self-hosted model for a lot of these businesses in certain times, certainly with like, you know, fine-tuning, and like, as you talked about, like, you get different model specializations.
[00:21:05.080 --> 00:21:12.160] So, so, yeah, maybe these startups can't, they can't just do whatever, just do API calls to the latest model to ChatGPT.
[00:21:12.160 --> 00:21:16.960] And, like, you know, as they crank up the cost there, you know, maybe that becomes unsustainable.
[00:21:17.280 --> 00:21:46.720] But I also think, okay, if it becomes so expensive that your business model doesn't work, then, given the specialization you're in, particularly if it's a very specialized vertical SaaS thing you're doing, you can probably get good-enough models by taking an open-source model, fine-tuning it, and really honing it toward your use case, which, to be honest with you, some of these companies probably need to be doing or thinking about anyway.
[00:21:46.720 --> 00:21:54.960] Because once you get to scale and you're a certain size, if really all you are is if-this-then-that plus API calls to OpenAI,
[00:21:55.280 --> 00:21:57.600] well, then they have you by the balls.
[00:21:57.600 --> 00:22:08.160] And versus like, if I had a 20 million ARR business that was doing that right now, I'd be like, hmm, yeah, let's spend some money and some time figuring out like, can we have our own open source model?
[00:22:08.160 --> 00:22:13.200] Like, what does it look like to fine-tune that model so that we can host it, so that we can really control the whole stack?
[00:22:13.200 --> 00:22:14.480] Is what I would say.
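The "own the whole stack" suggestion usually comes down to a break-even calculation: at what monthly volume does a fixed-cost self-hosted model undercut per-token API pricing? A minimal sketch, where the GPU rental rate and the per-million-token API price are assumptions for illustration, not numbers from the episode:

```python
# Back-of-the-envelope comparison of pay-per-token API calls vs. a
# fixed-cost self-hosted open-source model. All prices are illustrative.

def api_cost(tokens: int, price_per_million: float) -> float:
    """Monthly API bill for a given token volume."""
    return tokens / 1_000_000 * price_per_million

def breakeven_tokens(gpu_cost_per_month: float, price_per_million: float) -> float:
    """Monthly token volume above which a dedicated GPU box is cheaper."""
    return gpu_cost_per_month / price_per_million * 1_000_000

# e.g. a $1,500/month GPU server vs. a $10-per-million-token API:
print(breakeven_tokens(1_500, 10.0))  # 150,000,000 tokens/month
```

Below the break-even volume the API is the cheaper option; above it, the self-hosted box wins, before accounting for the engineering cost of running it.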
[00:22:14.480 --> 00:22:16.000] Yeah, there definitely is.
[00:22:16.560 --> 00:22:17.040] It's interesting.
[00:22:17.040 --> 00:22:29.840] I'm going to pass it to you in a second, Tracy, but that makes me think that these underlying AI providers, the LLM API providers, I imagine that they, you know, what happens when they do GPT-cheap?
[00:22:29.840 --> 00:22:34.400] And we know that it's like the 1.0 or 2.0 version, but it's a fifth or a tenth of the price.
[00:22:34.400 --> 00:22:41.520] And to your point, Einar, it's good enough to do a lot of stuff, because it was good enough 18, 20, 24 months ago, but it's way cheaper, right?
[00:22:41.520 --> 00:22:49.520] So that could be a thing of the future, where we haven't really gotten to the point of thinking about cost, because I pay 20 bucks a month for ChatGPT and I don't even think about it.
[00:22:49.520 --> 00:22:51.920] And if they charge me 50, I would still pay it.
[00:22:52.080 --> 00:22:54.560] But, you know, the cost is out of line with what it takes to produce.
[00:22:54.560 --> 00:22:56.800] I don't think it's anything that a lot of us are thinking about.
[00:22:56.800 --> 00:22:58.880] I don't think about the cost of the API these days.
[00:22:59.040 --> 00:22:59.680] How about you, Tracy?
[00:22:59.680 --> 00:23:00.680] What are your thoughts here?
[00:23:01.240 --> 00:23:09.480] This is the place where, if you're a founder building tools and services on top of AI, and a lot of folks are doing that right now,
[00:23:09.480 --> 00:23:19.560] you should be thinking today about what the risks are a couple of years from now. You're seeing these changes coming down the pipeline; you're seeing things become more expensive.
[00:23:19.560 --> 00:23:23.480] You're seeing, you know, maybe you're not going to have as much functionality as before.
[00:23:23.480 --> 00:23:25.960] You can't just like build right now and assume it's going to be that way forever.
[00:23:25.960 --> 00:23:35.000] So the people who are thinking about it today are the ones who are more likely to be living, still existing a couple years from now because they already have the expectations that this is going to happen.
[00:23:35.000 --> 00:23:36.280] Like it's going to get more expensive.
[00:23:36.280 --> 00:23:37.240] It's going to get harder to use.
[00:23:37.560 --> 00:23:41.640] You need to start putting the work in today to future-proof yourself.
[00:23:41.640 --> 00:23:48.280] And if you don't, like a couple years from now, it could be like, oh, wait, I built this whole business on the fact that this was really cheap and now it's gone away.
[00:23:48.280 --> 00:23:49.640] And oh, my business is over.
[00:23:49.800 --> 00:23:50.520] It's just so sad.
[00:23:51.080 --> 00:23:52.360] No, so it's like, think today.
[00:23:52.360 --> 00:24:09.400] Changes are happening so fast that thinking out different ways to differentiate yourself, different ways of how you get that data and how you provide it to your customers, what you can do to future-proof yourself, is going to determine who's still around in a couple of years versus not.
[00:24:09.720 --> 00:24:17.960] Our next story is about Windsurf, which was a venture-backed startup that, do I need to say allegedly, allegedly screwed its employees?
[00:24:17.960 --> 00:24:27.560] But we'll link to a tweet from one of the employees who basically said, I was employee number two and I've worked on AI and code for years.
[00:24:27.560 --> 00:24:35.000] And all of my options, my vested shares, I basically got 1% of that value.
[00:24:35.000 --> 00:24:45.520] And the quick summary of this, I mean, it's kind of complicated, but OpenAI, in early July of 2025, almost acquired Windsurf for $3 billion, but the deal fell apart.
[00:24:44.920 --> 00:24:48.480] Google swooped in and acqui-hired.
[00:24:48.720 --> 00:24:52.080] And again, I don't know if I need to say allegedly, like, you know, what are the facts here?
[00:24:52.080 --> 00:24:57.600] But Google struck a $2.4 billion arrangement, a non-exclusive license to Windsurf's technology.
[00:24:57.600 --> 00:25:07.120] They basically gutted the company; it's a non-exclusive license, but they hired the CEO, the co-founder, all the key R&D leaders, and no equity or control of the company changed hands.
[00:25:07.120 --> 00:25:14.160] And so then another company, Cognition, jumped in and acquired kind of what's left, which to me feels a bit like a shell of a company.
[00:25:14.160 --> 00:25:17.440] But early employees are like, I got nothing.
[00:25:17.440 --> 00:25:22.560] The investors and the founders got, supposedly, roughly half of that $2.4 billion.
[00:25:22.560 --> 00:25:25.680] So about $1.2 billion went to investors and founders.
[00:25:25.680 --> 00:25:30.560] And there was some money left in the company, but again, it feels kind of gutted.
[00:25:30.560 --> 00:25:40.160] And really what I want to talk about here, it's an interesting situation because it feels like the social contract of Silicon Valley was violated.
[00:25:40.160 --> 00:25:42.400] And we've heard little stories about this over the years.
[00:25:42.400 --> 00:25:49.280] I think Toptal is either an LLC, or maybe they raised a SAFE and never converted it to equity.
[00:25:49.280 --> 00:25:52.560] And allegedly, the founder just never converted it.
[00:25:52.560 --> 00:25:56.240] So no one actually has equity, even though they invested, and he's just taking money out of the company.
[00:25:56.240 --> 00:25:57.920] You know, again, allegedly.
[00:25:57.920 --> 00:26:00.800] But in this case, like people did get screwed.
[00:26:00.800 --> 00:26:13.680] And oftentimes, when I read these types of stories, because we talked on this show, on a Hot Take Tuesday, about the founders of one of these betting sites, how they had sold for a billion dollars and didn't get any money years later.
[00:26:13.680 --> 00:26:15.200] And we talked through like how that actually works.
[00:26:15.200 --> 00:26:17.760] And I mean, usually they don't report the whole story, is what happens.
[00:26:17.760 --> 00:26:25.760] It's like, yeah, they sold all their shares, or they took secondary out early on, and so on, or they raised hundreds of millions and diluted themselves into nothing.
[00:26:25.760 --> 00:26:28.400] Like, there's usually more to the story than they actually report.
[00:26:28.400 --> 00:26:33.080] But in this one, I feel like the employees got screwed.
[00:26:33.080 --> 00:26:35.080] So, do you want to kick us off, Einar?
[00:26:35.480 --> 00:26:37.480] What are your thoughts on what happened here?
[00:26:37.560 --> 00:26:38.600] This is a bad precedent.
[00:26:38.600 --> 00:26:41.160] I mean, this is dirty pool.
[00:26:41.480 --> 00:26:46.680] I mean, I have to admit, I was in shock when I first read the story.
[00:26:46.680 --> 00:26:49.320] And it still gets me.
[00:26:49.720 --> 00:26:51.960] Is it a breach of the social contract of Silicon Valley?
[00:26:52.120 --> 00:26:53.000] Hell yeah.
[00:26:53.000 --> 00:26:54.120] Hell yeah, it is.
[00:26:54.120 --> 00:26:55.400] Like, are you kidding me?
[00:26:55.400 --> 00:26:57.480] Like, it was supposed to be a $3 billion deal.
[00:26:57.560 --> 00:26:59.000] Imagine if you're the employee, right?
[00:26:59.000 --> 00:27:03.720] You sit around and you're like, oh, crap, we're being bought by OpenAI for $3 billion.
[00:27:03.720 --> 00:27:05.320] Like, I've made it.
[00:27:05.320 --> 00:27:07.400] Like, I'm doing options math.
[00:27:07.400 --> 00:27:09.240] Like, what portion am I buying?
[00:27:09.400 --> 00:27:11.560] Buying the Tahoe place immediately?
[00:27:11.560 --> 00:27:13.080] Or do I wait six months?
[00:27:13.080 --> 00:27:14.760] Like, things are good.
[00:27:15.080 --> 00:27:19.560] And then you come in and you're like, oh, the deal fell apart.
[00:27:19.560 --> 00:27:27.720] And the founders and some people, like, the selected ones, are getting almost as much money or maybe more and going to Google.
[00:27:27.720 --> 00:27:37.320] And I'm behind with the unchosen ones in this company now, where Google owns like a license to what?
[00:27:37.320 --> 00:27:37.880] What?
[00:27:38.200 --> 00:27:39.560] Like, that's not cool.
[00:27:39.560 --> 00:27:57.080] I, you know, like, even if... there was some argument saying, look, they put like a hundred-million-dollar investment into the company, and that was the equivalent of, what, the leftovers for the people, and they could have divvied that out, and they chose not to, blah, blah, blah.
[00:27:57.160 --> 00:28:01.240] I'm still like, yeah, still, you got effed is what happened.
[00:28:01.240 --> 00:28:02.360] You got effed.
[00:28:02.360 --> 00:28:03.480] It is not good at all.
[00:28:03.480 --> 00:28:04.840] I think it's bad precedent.
[00:28:04.840 --> 00:28:07.880] And I hope this is not something that people start doing.
[00:28:07.880 --> 00:28:09.560] Tracy, what are your thoughts?
[00:28:09.560 --> 00:28:17.520] You know, as someone who has been in and around Silicon Valley for way too long, it's really funny looking back. Not funny ha-ha.
[00:28:17.680 --> 00:28:23.440] You know, back in the day, in the early 2000s, big things were happening for these first employees.
[00:28:23.440 --> 00:28:25.920] These first employees were becoming billionaires and whatnot.
[00:28:25.920 --> 00:28:39.440] But it feels like over time, investors and acquiring companies have figured out, and this might be hearsay, but it feels from my perspective that there is now a bag of tricks that they have.
[00:28:39.440 --> 00:28:57.680] You know, there's all sorts of little tweaks and little rules and clauses and contractual things you can put into term sheets and whatnot that make it harder and harder for these first employees to get any sort of winnings from things like this.
[00:28:57.680 --> 00:28:58.480] And it does suck.
[00:28:58.480 --> 00:29:04.800] It does feel like that social contract is broken because people come into Silicon Valley and you're going to join a startup and you're going to get rich.
[00:29:04.800 --> 00:29:08.960] And it feels like it is getting harder and harder for that to actually happen.
[00:29:09.360 --> 00:29:14.880] And the only way, like you said, is to join a FAANG company as an employee; you just make way more money.
[00:29:14.880 --> 00:29:16.560] You don't have to worry about any of this.
[00:29:16.560 --> 00:29:20.080] Or you're like startup hopping and increasing your salary that way.
[00:29:20.080 --> 00:29:28.880] But you can't, it's hard to expect or you can't really expect anymore to have this big payout from your startup that you joined as employee number one.
[00:29:28.880 --> 00:29:31.920] And now the founders are making bank.
[00:29:32.240 --> 00:29:33.840] Well, so I don't disagree with that.
[00:29:33.840 --> 00:29:35.920] I mean, like, to a degree, I don't disagree with it.
[00:29:35.920 --> 00:29:42.320] I think there's always like, look, there's a difference between a company sort of not going all that well, like a soft landing type stuff.
[00:29:42.320 --> 00:29:47.280] And in those types of situations, yeah, quite often it's like the founders get taken care of.
[00:29:47.280 --> 00:30:00.200] Like the founders end up with some sort of a package because their buddies at Google acqui-hired them, and they're now worth, not billions, but multiple millions, or even 10 million over time because they stayed employees.
[00:30:00.200 --> 00:30:02.760] Versus, like, yeah, in that case, common shareholders are wiped out.
[00:30:02.760 --> 00:30:04.440] And it's just like, look, you got a job.
[00:29:59.920 --> 00:30:04.840] Good luck.
[00:30:05.160 --> 00:30:10.360] I mean, you're probably hearing all the stuff about everyone kvetching about Philz in Silicon Valley, right?
[00:30:10.360 --> 00:30:11.160] Yeah, yeah, exactly.
[00:30:12.040 --> 00:30:15.640] All the early employees are like, no... but Philz was an example.
[00:30:15.800 --> 00:30:16.760] They were not doing well, right?
[00:30:16.760 --> 00:30:23.000] When you looked at their actual numbers, and so they just got bought by PE, and everyone underneath was wiped out.
[00:30:23.560 --> 00:30:50.280] Yeah, and there are any number of stories, and there's all sorts of things, but to a degree, there are other scenarios too. You see this often with private-equity-type buyouts and more acqui-hire-type buyers, where it's actually the investors getting screwed: they come along and say, look, we're going to buy this company for 50 or 100 million dollars, but we're going to pay $3 million for the equity, or 1X on the equity, and the rest is carved out for retention.
[00:30:50.280 --> 00:30:51.720] That happens too.
[00:30:51.720 --> 00:30:56.760] And really, like, what it boils down to is like, it's about what the founders decide to fight for.
[00:30:56.760 --> 00:30:58.040] That's what it's about.
[00:30:58.360 --> 00:30:59.880] And that's just what it is.
[00:30:59.880 --> 00:31:05.480] You know, my sort of view on this is like, yeah, does it reflect badly on the wind source founder?
[00:31:05.480 --> 00:31:06.760] Yeah, I think it does.
[00:31:06.760 --> 00:31:09.080] I think it reflects poorly on Google's part too.
[00:31:09.080 --> 00:31:21.480] And I think it was DeepMind, which is, I don't know if that's a company or a department within Google, but I think anytime employees, one of the great things, look, Silicon Valley has done a lot of good things and a lot of not good things.
[00:31:21.480 --> 00:31:29.320] And I think one of the good things it has done is allowed folks, whether you're a founder or whether you're an early employee, to participate in upside of companies.
[00:31:29.320 --> 00:31:31.720] And there's a lot of places in the world where that doesn't exist.
[00:31:31.720 --> 00:31:32.960] There's a lot of places in the U.S.
[00:31:33.000 --> 00:31:34.360] where that doesn't exist.
[00:31:34.360 --> 00:31:40.600] And I think Silicon Valley has tended in general over time to do a decent job of that.
[00:31:40.600 --> 00:31:44.960] Not perfect, but this, I think, is a real problem.
[00:31:45.440 --> 00:31:47.120] I think that's the big difference.
[00:31:47.120 --> 00:31:49.360] Like, this is not an acqui-hire.
[00:31:44.600 --> 00:31:50.080] Well, on paper it is.
[00:31:50.160 --> 00:31:54.160] It's an acqui-hire mostly because of regulations around who can buy what.
[00:31:54.160 --> 00:31:55.200] But it's like...
[00:31:55.520 --> 00:32:01.040] The original deal, which was a successful, holy-crap-that's-a-big-number deal, was $3 billion.
[00:32:01.040 --> 00:32:02.720] The Google acqui-hire is 2.4.
[00:32:02.960 --> 00:32:04.960] That's ballpark the same number.
[00:32:04.960 --> 00:32:11.840] So the employees got screwed as if this was a soft landing for the founders, but it was a massive payday for the founders.
[00:32:11.840 --> 00:32:13.040] That's the big difference.
[00:32:13.040 --> 00:32:13.680] Yep.
[00:32:13.680 --> 00:32:25.200] And I think, you know, with the middle class declining in the U.S., which is a major problem, there are only so many opportunities for, quote-unquote, the American dream.
[00:32:25.200 --> 00:32:29.360] Like, does it still exist the way it did 60 years ago, 70 years ago?
[00:32:29.520 --> 00:32:30.800] I don't think so.
[00:32:30.800 --> 00:32:32.880] And I think that that's a problem.
[00:32:32.880 --> 00:32:34.320] I'm going to disagree on here.
[00:32:34.320 --> 00:32:44.320] I actually think one of the interesting stats about this declining middle class stuff is like, is actually, yeah, the middle class is declining, but that's partly because the upper class is expanding.
[00:32:44.320 --> 00:32:45.120] Is it?
[00:32:45.440 --> 00:32:45.840] Yeah, yeah.
[00:32:46.240 --> 00:32:50.320] So it's like, look, the middle class is declining, but also because people are getting richer.
[00:32:50.720 --> 00:32:54.320] A smaller percentage of people are getting richer while a larger percentage are getting poorer.
[00:32:54.560 --> 00:32:56.960] But yeah, I think people struggle.
[00:32:57.200 --> 00:33:03.040] When I go to get gas or buy avocados, I'm like, how do people, if I was making 20 bucks an hour, how the f would I afford this stuff?
[00:33:03.040 --> 00:33:04.880] It's crazy expensive.
[00:33:04.880 --> 00:33:06.240] Yeah, that's true, yes.
[00:33:06.240 --> 00:33:08.880] But good for those people who are getting rich.
[00:33:09.760 --> 00:33:10.240] Yeah.
[00:33:10.480 --> 00:33:11.120] I mean, you know.
[00:33:11.360 --> 00:33:12.080] Billions of dollars.
[00:33:12.240 --> 00:33:21.280] I mean, one of the craziest things to me lately, like, the truly holy-crap, bananas thing for me, is what Zuck's doing with poaching AI researchers.
[00:33:21.280 --> 00:33:22.400] Like, insane.
[00:33:22.400 --> 00:33:23.280] People are getting paid.
[00:33:23.280 --> 00:33:26.160] Like, if that's true... like, I started reading these numbers.
[00:33:26.400 --> 00:33:26.960] This is bananas.
[00:33:26.360 --> 00:33:28.080] Like, no one's getting paid like this in the market.
[00:33:28.240 --> 00:33:29.120] What are the numbers?
[00:33:29.680 --> 00:33:40.680] I thought it was $100 million for the complete package, including equity and salary and all this stuff, for multiple AI researchers.
[00:33:41.080 --> 00:33:43.800] Yeah, it's unprecedented.
[00:33:43.880 --> 00:33:50.840] Hey, Zuck, if you're listening to this, I would totally leave TinySeed behind for a $100 million package.
[00:33:51.080 --> 00:33:57.880] If you need a podcaster and a voice for AI, a really opinionated Norwegian.
[00:33:57.880 --> 00:34:00.600] I actually, I think I'd probably do it for like 99.
[00:34:00.760 --> 00:34:02.280] Yeah, give you a discount.
[00:34:02.280 --> 00:34:03.080] Oh my gosh.
[00:34:03.080 --> 00:34:03.560] Yeah.
[00:34:03.560 --> 00:34:04.760] A special discount.
[00:34:05.160 --> 00:34:05.400] Yeah.
[00:34:05.400 --> 00:34:14.280] And to be clear about the previous story with Windsurf, I obviously, more than anyone, I don't begrudge any founder to want to change their life or to get rich building a company.
[00:34:14.280 --> 00:34:19.080] I think capitalism and democracy have done a pretty decent job of that.
[00:34:19.080 --> 00:34:21.160] And, you know, if anyone believes in it, I do.
[00:34:21.160 --> 00:34:25.400] But this in particular, these kind of edge cases where people get screwed, I think, is a real problem.
[00:34:25.400 --> 00:34:29.160] And my hope is that it's not something that continues.
[00:34:29.160 --> 00:34:42.840] Our last story of the day is more SaaS-focused, not news of the world, but it is an article on founderlabs.io: how to sell if your user is not the buyer.
[00:34:42.840 --> 00:34:46.280] For example, your developer is your user, but the CTO is the buyer.
[00:34:46.280 --> 00:34:48.760] And this is by Nate Ritter.
[00:34:48.760 --> 00:35:01.080] And we, of course, see this with TinySeed companies, where the easiest thing to do, if you are a bootstrapped founder, is to be patient zero, meaning you have the need, you validate it, and then you build it for you.
[00:35:01.080 --> 00:35:07.480] You know exactly who uses it, how they use it, and then you sell it to those people, and it's one user, and they make the decision.
[00:35:07.480 --> 00:35:10.520] That is the best scenario with B2B SaaS.
[00:35:10.520 --> 00:35:14.280] But it's, I would say, probably the minority of SaaS that we see.
[00:35:14.280 --> 00:35:19.680] And so, at a certain point, you do have to learn how to sell to folks who are not your end user.
[00:35:19.680 --> 00:35:22.960] And Nate talks through some thoughts and best practices.
[00:35:22.960 --> 00:35:25.520] Tracy, what are your thoughts on his piece?
[00:35:25.520 --> 00:35:26.320] I love this.
[00:35:26.320 --> 00:35:29.520] I think it's, like I said, something we see a lot with TinySeed.
[00:35:29.520 --> 00:35:34.800] And a lot of people are like, why can't it be easier than this?
[00:35:34.800 --> 00:35:37.440] Like, why can't these people just pay for my tool?
[00:35:37.440 --> 00:35:42.720] But they have to go through all these hoops, where they have to talk to their manager, and their manager has to talk to the buying team.
[00:35:42.720 --> 00:35:54.960] And they're talking to this developer who's saying how this would be such a great win, but then they, as the founder and owner of this tool, are divorced from being able to be in the room with those decision makers.
[00:35:54.960 --> 00:36:02.960] This article does a really good job, I think, of explaining what you could do as the SaaS founder, in your product, to make it easier.
[00:36:02.960 --> 00:36:20.560] One of the things mentioned that I thought was really great: if you think this through and you're able to build tools and resources for the non-buyer, the person who's using your product, the things they need to convince their leadership, you should do that.
[00:36:20.560 --> 00:36:22.720] Like think through, think like three steps ahead.
[00:36:22.720 --> 00:36:33.920] You have to do a harder sales process that way, so think through what the entire sales cycle is like, see what tools you have on your end that you can give to those folks to help them make the deal, and just talk to people.
[00:36:33.920 --> 00:36:35.840] But I know those are just kind of like band-aids.
[00:36:35.840 --> 00:36:40.880] I'm sure Einar has some thoughts here as a true salesperson on this three-person team.
[00:36:40.880 --> 00:36:42.560] Einar, what do you think?
[00:36:42.560 --> 00:36:44.560] I mean, I thought it was a good article.
[00:36:44.560 --> 00:36:46.400] You know, there's a bunch of stuff to this.
[00:36:46.640 --> 00:36:51.520] I think for me, it's more like, yeah, I don't disagree on this.
[00:36:51.520 --> 00:37:08.280] And I think at a higher level, I think a lot of, particularly bootstrap founders, they struggle conceptually or culturally, or whoever you pointed out, this notion that they kind of just want to, they just, a lot of them, not all, but they kind of just want to, they just want to like make money when they're sleeping.
[00:37:08.280 --> 00:37:13.400] Like they just, they just, they don't like buying, they're not the typical enterprise software buyers.
[00:37:13.400 --> 00:37:20.840] And so they can't believe that somebody doesn't want to go to like a sales page and there's the pricing and you just click and put your credit card in.
[00:37:20.840 --> 00:37:25.880] And like, why would, why isn't all software in the entire world, why isn't everything like this?
[00:37:26.200 --> 00:37:33.080] They think it's a dark pattern to hide pricing, and "call sales" is like a dirty word, and all this stuff.
[00:37:33.240 --> 00:37:41.000] And the fact of the matter is, almost 100% of the time, okay, let's call it 80 to 90% of the time...
[00:37:41.000 --> 00:37:43.000] If you think you're different, you're probably not.
[00:37:43.000 --> 00:37:55.080] You can get to, I would say, 1 million, 2 million, 3 million of ARR, somewhere in there for a lot of vertical SaaS businesses with that sort of self-serve model.
[00:37:55.080 --> 00:38:03.480] Like you can differentiate the pricing and like, you know, you got to figure out, there's a lot of stuff, don't get me wrong, there's a lot of stuff you got to get right to get to that level too.
[00:38:03.480 --> 00:38:18.280] But a lot of businesses, once they get to that stage, if you want to keep increasing your growth and take from that one, two, three million up to like five, 10, 15, 20, you really need to start, you know, unlocking enterprise sales.
[00:38:18.280 --> 00:38:22.120] And like what that means is dealing with issues like this.
[00:38:22.120 --> 00:38:26.280] And the fact of the matter is, you know, a lot of founders just don't want to do that.
[00:38:26.280 --> 00:38:34.920] And actually, shout out to Jason Cohen, friend of the show, who put out a new metric here, the max MRR, that I thought was quite interesting.
[00:38:34.920 --> 00:38:42.600] And it's this notion that, given your churn rate and how much new MRR you're booking,
[00:38:42.600 --> 00:38:44.200] what is your max MRR?
[00:38:44.200 --> 00:38:48.240] Where's your asymptote, to pull back to an earlier point?
[00:38:44.680 --> 00:38:50.960] And the fact of the matter is, you can do that math earlier on.
[00:38:51.120 --> 00:39:01.920] Like, given where I'm at now, the kind of new MRR I'm booking, and the churn rate that I've got, once you get to a certain size, you can do the math and know where you're going to tap out.
[00:39:01.920 --> 00:39:04.800] And fundamentally, you need to bend that curve.
[00:39:04.960 --> 00:39:06.560] You need to change those metrics.
[00:39:06.560 --> 00:39:11.520] And changing those metrics typically means landing larger accounts that churn less.
[00:39:11.520 --> 00:39:13.040] That's just the way it is.
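The max MRR asymptote described above can be written down directly: if you book a roughly constant amount of new MRR each month and churn a fixed percentage of the base, revenue approaches a ceiling where churned revenue exactly offsets new bookings. This is my reconstruction of the idea with illustrative numbers, not figures quoted from Jason Cohen:

```python
# Sketch of the "max MRR" asymptote: constant new MRR per month plus a
# fixed monthly churn rate implies a revenue ceiling where churned revenue
# exactly offsets new bookings. All numbers are illustrative.

def max_mrr(new_mrr_per_month: float, monthly_churn: float) -> float:
    """Steady-state MRR ceiling: new bookings divided by churn rate."""
    return new_mrr_per_month / monthly_churn

def simulate(start_mrr: float, new_mrr_per_month: float,
             monthly_churn: float, months: int) -> float:
    """Month-by-month MRR under the same assumptions."""
    mrr = start_mrr
    for _ in range(months):
        mrr = mrr * (1 - monthly_churn) + new_mrr_per_month
    return mrr

# $20k of new MRR per month with 2% monthly churn caps out at $1M MRR:
print(max_mrr(20_000, 0.02))                # 1000000.0
print(simulate(100_000, 20_000, 0.02, 60))  # roughly $730k after 5 years, still climbing
```

"Bending the curve" means changing the inputs: larger accounts raise new MRR per month and lower the churn rate, which moves the ceiling itself.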
[00:39:13.040 --> 00:39:19.040] And what also pisses me off sometimes is like, I see these founders, they get to like 1 million, 2 million, whatever.
[00:39:19.040 --> 00:39:21.360] And they're like, this is like, I can't figure it out.
[00:39:21.360 --> 00:39:22.960] Or like, I just don't want to do anything more.
[00:39:22.960 --> 00:39:29.200] And basically, if you don't do anything more, you'll stall out, and your business is worth less and less, even if you're maybe growing slightly.
[00:39:29.200 --> 00:39:33.920] Or they're like, I'm going to sell to this private equity group that comes through the door, and whatever they're paying, I don't care.
[00:39:34.000 --> 00:39:34.960] Like, it's good money.
[00:39:34.960 --> 00:39:53.600] And the fact of the matter is, a lot of these groups, this is what they know how to do: they've figured out that if you got it from zero to one, got it to a million or two or three, and didn't build out an enterprise sales effort when it's clear the product could support one, so much of the value you created compounds to them very quickly.
[00:39:53.600 --> 00:40:04.640] You could have spent five or six years getting it to one or two million, and then they buy it, and two or three years later, they make 80% of the money.
[00:40:04.640 --> 00:40:13.200] Yeah, this is why when founders are listening to advice on the internet, you need to ask, what's the source and what are their goals?
[00:40:13.200 --> 00:40:18.000] And it used to be, well, we'd see all this funded, like the VC-backed advice for startup founders.
[00:40:18.000 --> 00:40:21.520] Like, you have to go after a billion-dollar market, and you know, you have to, have to, have to.
[00:40:21.520 --> 00:40:24.960] And Bootstrap founders would read this and be like, I have to go after a billion-dollar market, right?
[00:40:24.960 --> 00:40:27.440] And that's where I came in and was like, no, stop doing that.
[00:40:27.440 --> 00:40:28.800] Like, you're a bootstrapper.
[00:40:28.800 --> 00:40:31.960] You don't need all the stuff that you see venture-backed folks saying.
[00:40:32.040 --> 00:40:39.720] And that's why, if you listen to any of the big VCs, like there's usually some nuggets of truth, but there's a lot of bias towards becoming a unicorn or a decacorn, right?
[00:40:39.720 --> 00:40:42.200] And a lot of that stuff will actually break a bootstrap company.
[00:40:42.200 --> 00:40:51.160] But even within bootstrapping, I had this realization, well, it was about eight or nine years ago that there are the lifestyle bootstrappers and the ambitious bootstrappers.
[00:40:51.160 --> 00:40:52.120] This is the way I term it, right?
[00:40:52.120 --> 00:41:04.760] And lifestyle bootstrappers are more like the indie hackers, where it's like, I don't want to do stuff I don't want to do, and I just want a business that's, I will, I'm willing to limit my growth and my aspirations based on my comfort zone, how much I want to work, whatever it is.
[00:41:04.760 --> 00:41:06.440] I want my lifestyle to be number one.
[00:41:06.440 --> 00:41:09.560] I was that until about 2011, 2012.
[00:41:09.560 --> 00:41:13.720] And then when we started Drip, I realized, oh, I have a tiger by the tail.
[00:41:13.720 --> 00:41:15.000] This could be life-changing.
[00:41:15.000 --> 00:41:15.800] I'm going to switch.
[00:41:15.800 --> 00:41:18.520] And I went into a different mode where I then sacrificed some things.
[00:41:18.520 --> 00:41:20.120] I worked more than I should have.
[00:41:20.120 --> 00:41:27.240] I sacrificed, you know, some of my quality of life for a few years in order to get rich, is basically what it was.
[00:41:27.240 --> 00:41:31.560] And that's, I think, something that folks miss online.
[00:41:31.560 --> 00:41:33.160] And you'll see it on X Twitter.
[00:41:33.160 --> 00:41:36.760] You'll see an indie hacker arguing with someone who wants to build a $10 million SaaS.
[00:41:36.760 --> 00:41:40.120] And really, what you need to do is different in those two cases.
[00:41:40.120 --> 00:41:41.800] And so you can say, well, you should never do this.
[00:41:41.800 --> 00:41:42.360] You should always do this.
[00:41:42.360 --> 00:41:47.400] And it's like, well, not if I'm designing my life and my company a little differently than yours.
[00:41:47.720 --> 00:41:48.840] I wanted to pop in here.
[00:41:49.160 --> 00:41:52.840] Something just triggered a little thought in my head around sales.
[00:41:52.840 --> 00:41:57.000] I've been doing sales for our new product at TinySeed.
[00:41:57.000 --> 00:41:57.880] We have SaaS Institute.
[00:41:57.880 --> 00:42:03.000] We're working with $1 million-plus ARR founders, and we have the accelerator.
[00:42:03.000 --> 00:42:07.720] We have this process for the way that we grow our companies with the accelerator that we've been running for the last six years.
[00:42:07.720 --> 00:42:16.640] And we have this new product, SaaS Institute, with mentorship, advisors, coaching, and facilitated masterminds.
[00:42:16.640 --> 00:42:17.840] And I'm doing the sales for that.
[00:42:17.840 --> 00:42:20.080] And this is brand new to me.
[00:42:14.840 --> 00:42:21.760] As someone who loves to build products,
[00:42:21.920 --> 00:42:23.120] I haven't really done sales before.
[00:42:23.120 --> 00:42:27.680] I've been talking to founders within TinySeed for the last six years about how much people hate sales.
[00:42:27.680 --> 00:42:34.480] And the thing that really changed my perspective on this, it sounds kind of silly, is that sales is everywhere.
[00:42:34.480 --> 00:42:48.160] Sales is just understanding the perspective of the person you're talking to, understanding what they need, and figuring out on your end how to give it to them or how to match their needs.
[00:42:48.160 --> 00:42:56.560] And so, if you're talking to someone who's not necessarily a buyer, how do you give that person who is not the buyer the tools that they need to make this happen?
[00:42:56.560 --> 00:43:07.920] When I'm talking to someone and trying to convince them to spend a couple thousand dollars a month to join our coaching program, what can I say to reassure them of the value of that?
[00:43:07.920 --> 00:43:10.720] What do they need in order to be assured of that value?
[00:43:10.720 --> 00:43:17.120] And this kind of like has leaked into my thoughts for everything I do when I'm talking to Rob about my job.
[00:43:17.120 --> 00:43:18.240] A lot of that's sales too.
[00:43:18.240 --> 00:43:19.520] Like, what does Rob?
[00:43:19.520 --> 00:43:20.880] He's shaking his head at me.
[00:43:20.880 --> 00:43:22.080] What does Rob need, right?
[00:43:22.080 --> 00:43:24.480] What do Rob and Einar need?
[00:43:24.480 --> 00:43:34.880] And when I'm talking about what I'm doing in my job and when I want something, how do I phrase that in a way that I am reassuring them of the things that they need, but I'm also getting what I want?
[00:43:34.880 --> 00:43:42.880] And so when founders come to me and they say, like, oh, I don't like to do sales, I think that an interesting way of rethinking it is it's not a scary process.
[00:43:42.880 --> 00:43:45.440] It's literally, you kind of have to think one step ahead.
[00:43:45.440 --> 00:43:48.800] How do I, the person I'm talking to, how do I get them what they need?
[00:43:48.800 --> 00:43:50.160] How do I understand what they need?
[00:43:50.160 --> 00:43:51.440] And how do I get it to them?
[00:43:51.440 --> 00:43:54.160] And then sales becomes a lot easier when you think of it that way.
[00:43:54.480 --> 00:43:55.920] Tracy Osborne.
[00:43:55.920 --> 00:43:58.320] People can find you on the internet at TracyOsborne.com.
[00:43:58.320 --> 00:44:01.640] And of course, tracyosborne.com on Bluesky.
[00:44:01.640 --> 00:44:06.600] You're broadcasting to us live from the gift shop of a 1980s ski lodge.
[00:44:06.600 --> 00:44:09.640] Thank you so much for making the show today.
[00:44:09.640 --> 00:44:11.000] It's true.
[00:44:11.640 --> 00:44:12.920] Glad to be here.
[00:44:12.920 --> 00:44:20.920] Einar Vollset, you, of course, are posting and arguing about the Giants on X Twitter at Einar Vollset.
[00:44:20.920 --> 00:44:24.040] We will link that up in the show notes.
[00:44:24.440 --> 00:44:27.080] Do not judge us by his Twitter feed.
[00:44:27.080 --> 00:44:28.200] Yeah, no, no, no.
[00:44:28.440 --> 00:44:32.920] His opinions do not reflect those of Tracy or TinySeed.
[00:44:33.240 --> 00:44:42.520] Einar is, of course, broadcasting from a discreet garage where he looks like he's about to hold the annual meeting of his Rotary Club.
[00:44:42.520 --> 00:44:45.480] So we will wrap it up here.
[00:44:45.480 --> 00:44:47.080] Einar, thanks so much for joining.
[00:44:47.080 --> 00:44:48.920] Sweet burns at the end.
[00:44:48.920 --> 00:44:49.800] Sweet burns.
[00:44:49.800 --> 00:44:56.920] The other joke was: I'm expecting to see your bowling trophy collection appearing just off camera inside the wood panel behind you.
[00:44:56.920 --> 00:44:59.720] All that said, you two, thanks for joining.
[00:44:59.720 --> 00:45:01.560] Yeah, glad to be here.
[00:45:01.560 --> 00:45:02.840] Thanks for having me.
[00:45:02.840 --> 00:45:05.800] Hope you enjoyed this episode of Hot Take Tuesday.
[00:45:05.800 --> 00:45:10.920] If you like this show format, please give me a shout on X Twitter.
[00:45:10.920 --> 00:45:12.280] I'm at Rob Walling.
[00:45:12.280 --> 00:45:20.040] And if you look at the StartupsPod Twitter account, you'll see a tweet with a video clip from this week's episode that I typically retweet.
[00:45:20.040 --> 00:45:22.920] Just chime in there and say, I really enjoyed this episode.
[00:45:22.920 --> 00:45:24.120] You should do more of these.
[00:45:24.120 --> 00:45:25.720] Because I don't know the last time we did one.
[00:45:25.720 --> 00:45:26.760] It was probably six months ago.
[00:45:26.760 --> 00:45:32.360] And I've gotten a couple requests over time to do it, but I often just kind of forget about doing them.
[00:45:32.360 --> 00:45:35.480] So, if you want to hear more of these, let me know on Twitter.
[00:45:35.480 --> 00:45:37.400] Thanks for tuning in this week and every week.
[00:45:37.400 --> 00:45:41.240] This is Rob Walling signing off from episode 792.
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
  "segments": [
    {
      "segment_title": "Topic Discussion",
      "timestamp": "01:15:30",
      "key_takeaway": "main point from this segment",
      "segment_summary": "brief description of what was discussed"
    }
  ]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Prompt 4: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
  "media_mentions": [
    {
      "title": "Exact Title as Mentioned",
      "category": "Book",
      "author_artist": "N/A",
      "context": "Brief context of why it was mentioned",
      "context_phrase": "The exact sentence or phrase where it was mentioned",
      "timestamp": "estimated time like 01:15:30"
    }
  ]
}
If no media is mentioned, return: {"media_mentions": []}
Full Transcript
[00:00:00.160 --> 00:00:02.800] You've stumbled upon another episode of Startups for the Rest of Us.
[00:00:02.800 --> 00:00:16.800] I'm Rob Walling, and in this Hot Take Tuesday, I welcome back Einar Vollset and Tracy Osborne as we talk about GPT-5 and the struggles they've had with that release, whether we're in an AI bubble, and a few other relevant news items.
[00:00:16.800 --> 00:00:19.600] One of them actually deeply relevant to SaaS sales.
[00:00:19.600 --> 00:00:25.760] Before we dive into the episode, I wanted to let you know that TinySeed applications are open for the fall 2025 batch.
[00:00:25.760 --> 00:00:37.040] If you're a B2B SaaS founder with at least $1,000 in MRR and you're looking for the right amount of funding, a community of ambitious, like-minded founders, and a network of world-class mentors, you should apply.
[00:00:37.040 --> 00:00:45.200] If you have any questions about the program or the application process, we're also doing a live Q&A with the TinySeed team this Wednesday, September 3rd, on YouTube.
[00:00:45.200 --> 00:00:46.960] We'll link that up in the show notes.
[00:00:46.960 --> 00:00:51.360] If you know your metrics, the application only takes about 10 or 15 minutes to complete.
[00:00:51.360 --> 00:00:53.840] Applications close on September 9th.
[00:00:53.840 --> 00:00:57.760] You can get all the details at tinyseed.com/apply.
[00:00:57.760 --> 00:00:59.520] All right, let's get into it.
[00:01:06.560 --> 00:01:10.320] Welcome back to Hot Take Tuesday.
[00:01:12.560 --> 00:01:17.280] I have two guests, my recurring panel, and this is by popular demand.
[00:01:17.280 --> 00:01:23.440] And by that, I mean two people have mentioned it to me in the past six weeks saying, why don't you do Hot Take Tuesday anymore?
[00:01:23.440 --> 00:01:24.880] And I was like, I don't know.
[00:01:24.880 --> 00:01:25.840] People have been traveling.
[00:01:25.840 --> 00:01:26.640] It's been summer.
[00:01:26.640 --> 00:01:27.760] So here we are.
[00:01:27.760 --> 00:01:34.960] We're going to cover stories ranging from, well, you know, doesn't everyone want to hear us talk about American politics and crypto?
[00:01:34.960 --> 00:01:36.640] No, we're not going to do either of those things.
[00:01:36.640 --> 00:01:38.400] We're going to discuss Kirkhop.
[00:01:38.720 --> 00:01:39.280] He did.
[00:01:39.280 --> 00:01:41.120] He's like, I want to talk about both.
[00:01:41.120 --> 00:01:43.840] GPT-5, is it worse than 4o?
[00:01:43.840 --> 00:01:48.560] Are we entering an AI bubble with the limits that folks are starting to put on it?
[00:01:48.560 --> 00:01:54.080] Windsurf, and that whole debacle of employees getting hosed and more topics.
[00:01:54.080 --> 00:02:00.440] First, I'd like to introduce my panelists, the head of product for TinySeed and MicroConf.
[00:02:00.440 --> 00:02:03.000] Tracy Osborne.
[00:02:03.960 --> 00:02:05.240] That was awful.
[00:01:59.920 --> 00:02:06.600] That was awful.
[00:02:08.840 --> 00:02:10.520] Do you want to do that again?
[00:02:11.800 --> 00:02:13.320] Welcome to the show.
[00:02:13.320 --> 00:02:18.360] I was going to say, last time I was here, I was wondering if you've realized I have a title change since the last time at this time.
[00:02:18.360 --> 00:02:19.640] It's been that long.
[00:02:20.520 --> 00:02:21.560] It's been a while.
[00:02:21.560 --> 00:02:25.720] You used to be Tracy Makes on Twitter, and that's where we'll leave it.
[00:02:25.720 --> 00:02:27.480] No, wait, you're Trace, at Tracy Osborne.
[00:02:27.560 --> 00:02:27.880] Where are you?
[00:02:27.880 --> 00:02:30.040] TracyOsborne.com on Bluesky.
[00:02:30.200 --> 00:02:30.840] I am.
[00:02:30.840 --> 00:02:31.400] Okay.
[00:02:31.800 --> 00:02:36.920] And we're still counting down the days until you owe Einar $1,000 and buy us both sushi dinner.
[00:02:36.920 --> 00:02:37.720] That was in there, right?
[00:02:37.720 --> 00:02:38.680] Wasn't it a sushi dinner?
[00:02:39.160 --> 00:02:43.400] We never determined what all the terms were between Twitter and between.
[00:02:43.480 --> 00:02:48.440] What is three years, and I think it was $100 million.
[00:02:48.840 --> 00:02:50.360] Sorry, that wasn't even active.
[00:02:50.920 --> 00:02:52.440] It was like 50 or 100, and we kind of bounced.
[00:02:52.440 --> 00:02:53.080] I think it's 100%.
[00:02:53.160 --> 00:02:54.760] Elon Musk has a lot to go.
[00:02:54.760 --> 00:02:56.680] So I sort of remain hopeful.
[00:02:57.240 --> 00:03:09.080] The other person chiming in is our very own Einar Vollset, co-founder of TinySeed and founder of Discretion Capital, the best sell-side M&A firm for SaaS between 2 and 20 million ARR.
[00:03:09.080 --> 00:03:10.920] Einar Vollset, welcome to the show.
[00:03:10.920 --> 00:03:12.200] Thanks for having me.
[00:03:12.200 --> 00:03:14.040] All right, let's dive in to our first story.
[00:03:14.040 --> 00:03:15.720] This is actually story zero.
[00:03:15.720 --> 00:03:23.480] And we've been talking about this for the past couple weeks and getting a lot of positive responses as one would expect on X Twitter and the like.
[00:03:23.480 --> 00:03:35.080] TinySeed, our startup accelerator for ambitious B2B SaaS founders, has returned fund one, meaning we have returned more capital to our investors, our LPs, than they invested.
[00:03:35.080 --> 00:03:36.440] This is a six-year-old fund.
[00:03:36.440 --> 00:03:46.400] And Einar, I believe, with the benchmarks across venture capital funds, we are in the top 10%, potentially top 5%, of funds of that vintage.
[00:03:46.400 --> 00:03:49.600] So, you want to talk, you know, why is this a big deal?
[00:03:44.840 --> 00:03:52.080] Because I've never run a fund before, and I was like, oh, that's neat.
[00:03:52.240 --> 00:03:53.520] And then I'm like, no, that's not neat.
[00:03:53.520 --> 00:03:54.880] Like, this is a big deal.
[00:03:54.880 --> 00:03:55.840] Why is that?
[00:03:55.840 --> 00:03:56.640] It's a big deal.
[00:03:56.640 --> 00:04:01.680] I mean, I think it puts us in the top 10%, if not top 5%, of that vintage.
[00:04:01.680 --> 00:04:04.480] And it's not just like, oh, that was a really crappy vintage.
[00:04:04.480 --> 00:04:14.640] The fact of the matter is that venture funds have gotten to the point, partly because tech companies stay private for longer, that sort of thing,
[00:04:14.640 --> 00:04:23.280] where they take longer and longer before they return any capital, and certainly before they start returning the fund and multiples on that.
[00:04:23.280 --> 00:04:32.560] So actually, I just saw there was a stat out from Carta, and DPI is the term, distributions to paid-in.
[00:04:32.560 --> 00:04:37.680] And it basically says, over 12 years in, only about half the funds have returned 1x DPI.
[00:04:37.680 --> 00:04:47.360] So what that means is if you, if you invested in all the venture funds out there, you know, and you held it for 12 years, only about half those funds would even have returned your capital.
[00:04:47.360 --> 00:04:49.280] So yeah, it's a big deal.
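The DPI metric described here is simple arithmetic: cash distributed back to LPs divided by the capital they paid in, with 1.0x meaning the fund has been returned. A minimal sketch with invented numbers:

```python
def dpi(distributions: float, paid_in_capital: float) -> float:
    """Distributions to Paid-In: cash actually returned to LPs
    divided by the cash they invested (1.0 means the fund is returned)."""
    return distributions / paid_in_capital

# A fund that has paid $1.2M back to LPs who put in $1M is at 1.2x DPI,
# with any remaining portfolio value still on top of that.
print(dpi(1_200_000, 1_000_000))  # 1.2
```

By the Carta stat quoted above, roughly half of 12-year-old funds never cross even 1.0x on this measure.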
[00:04:49.280 --> 00:04:56.720] And also, I think crucially for us, it's not just like, you know, you can return the fund and then have nothing left, in which case, like, yeah, that wasn't so great.
[00:04:56.720 --> 00:05:03.520] But the fact is, like, for us, for fund one, you know, we still have, you know, the majority of the assets are still growing.
[00:05:03.520 --> 00:05:06.880] And sort of it's a very good sign very early on.
[00:05:06.880 --> 00:05:14.800] Like, VC is kind of funny in the sense that like one career where like people don't really know and you don't really know if you're any good until 10 years in.
[00:05:15.760 --> 00:05:18.880] So it's a good place for charlatans and fast talkers.
[00:05:18.880 --> 00:05:21.120] So we'll see how, we'll see if that includes us.
[00:05:21.120 --> 00:05:22.880] But yeah, it's a very good sign.
[00:05:22.880 --> 00:05:24.880] And yeah, I feel very good about it.
[00:05:24.880 --> 00:05:29.360] And also, crucially, like, look, we return, you know, the money of the people who trusted us the first time around.
[00:05:29.360 --> 00:05:30.440] Like, this was fund one.
[00:05:30.760 --> 00:05:32.760] Like, it was an unproven thing.
[00:05:29.840 --> 00:05:36.840] You know, a lot of people called this really stupid, like, it's not going to work out.
[00:05:37.080 --> 00:05:39.240] Like, you're never even going to get your money back.
[00:05:39.240 --> 00:05:48.760] And at this point, we've proven them wrong, and we can, you know, tell those people where to go and give the people who trusted us more than the money they put in.
[00:05:48.760 --> 00:05:51.800] They've already gotten a return on their capital with more to come.
[00:05:51.800 --> 00:05:52.920] So that feels good.
[00:05:52.920 --> 00:05:54.280] Yeah, really exciting.
[00:05:54.280 --> 00:05:55.240] Tracy?
[00:05:55.560 --> 00:06:06.840] The thing I didn't realize before I started working here, I, you know, because before I was a startup founder, I tried, I raised some money from VCs, but I didn't really quite understand how VC economics worked.
[00:06:06.840 --> 00:06:18.440] And the return of the fund is super exciting because compared to startups where you're like, you're doing super well and you can reinvest those profits into the business and continue growing, we have to continually raise new funds.
[00:06:18.440 --> 00:06:39.960] So if we have this massive success on our first fund, one that proves out our thesis and proves that what we're trying to do is working really well, that means we can raise more funds. And we need to raise more funds in order to exist, because those returns aren't coming back to TinySeed for us to invest in other people.
[00:06:39.960 --> 00:06:41.800] We have to raise new funds in order to do that.
[00:06:41.800 --> 00:06:47.320] And then we live off those, you know, run the company essentially off those management fees.
[00:06:47.320 --> 00:06:54.680] So on one hand, it's like it's super awesome to return that fund, but it's also for me, it's like, yay, I get to have a job for longer because of this, you know?
[00:06:55.320 --> 00:06:56.440] I like that part.
[00:06:56.760 --> 00:06:58.120] It's also, it's also pretty strange.
[00:06:58.200 --> 00:07:01.160] It's like, look, investing is like this.
[00:07:01.160 --> 00:07:08.120] You take people's money and you're basically saying, I know better than you what to do with your money.
[00:07:08.120 --> 00:07:13.080] You should give your money to me instead of doing whatever you want to do with it yourself.
[00:07:13.080 --> 00:07:20.560] And like being able to prove that out with actual cash in the bank instead of like arguing about markups and paper returns and all this stuff feels really good.
[00:07:20.880 --> 00:07:22.880] That's the thing because we came out with a thesis.
[00:07:22.880 --> 00:07:31.440] A, that bootstrapped B2B SaaS founders would even take money, because there was a whole conversation around this in 2017, 2018 as I started talking about it at MicroConf.
[00:07:31.520 --> 00:07:35.120] There was a real pushback from a small group, but like, why would they take your money?
[00:07:35.120 --> 00:07:35.920] Why would they give up equity?
[00:07:35.920 --> 00:07:36.720] They should just bootstrap.
[00:07:36.720 --> 00:07:37.760] So there was that thesis.
[00:07:37.760 --> 00:07:41.920] Then there's the thesis of, does this even return close to the S&P 500, right?
[00:07:41.920 --> 00:07:49.040] Because if it doesn't return that over, you know, what, 10 years or whatever the horizon is, no one's going to give us more money and it's not viable.
[00:07:49.040 --> 00:07:50.640] And so those were our theses.
[00:07:50.640 --> 00:07:55.360] And obviously the first one has been proven because we funded 204 companies in the past six years.
[00:07:55.360 --> 00:07:57.520] And the second one, the signs are good.
[00:07:57.520 --> 00:08:01.680] We've returned the fund and as you said, more than the initial investment.
[00:08:01.680 --> 00:08:05.120] So folks are already getting a return and there's still a lot of upside.
[00:08:05.120 --> 00:08:10.240] With that in mind, fund three, tiny seed fund three is closing soon.
[00:08:10.240 --> 00:08:11.600] We have a little bit more room.
[00:08:11.600 --> 00:08:25.760] And if you are an accredited investor anywhere in the world and would like to put some money to work in hand-picked, high-quality, I would say some of the best deal flow in B2B SaaS in the world, head to tinyseed.com/invest.
[00:08:25.760 --> 00:08:26.880] And you can read our thesis.
[00:08:26.880 --> 00:08:30.320] And if you fill out that form there, it goes directly to Einar's inbox.
[00:08:30.320 --> 00:08:31.520] And he will reach out.
[00:08:31.520 --> 00:08:38.480] Before we move on from this topic, Einar, the last post I put on X Twitter got a question from Daniel Westendorf.
[00:08:38.480 --> 00:08:41.200] And he asked, are you able to share the distribution of returns?
[00:08:41.200 --> 00:08:45.280] Like, was the majority from a few portfolio exits or something else?
[00:08:45.280 --> 00:08:50.480] And obviously, we don't give exact numbers because we don't need to because we're a private fund, but what's your sentiment around this?
[00:08:50.800 --> 00:08:52.720] Yeah, the short answer is yes.
[00:08:53.040 --> 00:08:58.480] It's exits or partial exits from one or two or a small number of the portfolio.
[00:08:58.480 --> 00:09:06.920] And that's sort of like very much in line with the thesis that we wrote ahead of Actually Fund 2, which is like, look, this is still a power law type environment.
[00:09:07.160 --> 00:09:12.360] It's not like, you know, all the companies are going to grow uniformly at the same rate.
[00:09:12.360 --> 00:09:18.760] There's still going to be some companies that outperform to the point where, like, yeah, it's obvious that this is what drives performance.
[00:09:18.760 --> 00:09:21.800] And that's sort of been proven out for us as well.
[00:09:22.120 --> 00:09:26.440] And if anyone's curious about what this all means, we do have this on our website.
[00:09:26.440 --> 00:09:28.680] It's at tinyseed.com/thesis.
[00:09:28.680 --> 00:09:32.680] So if you are a data nerd, definitely check that out because I think it's really interesting.
[00:09:32.680 --> 00:09:40.440] Again, coming from someone coming from the startup world, coming into VC, seeing how these things kind of work behind the scenes, check out our thesis if you want to read more.
[00:09:40.440 --> 00:09:49.560] All right, moving on to our next story, something that's on a lot of folks' minds right now is GPT-5 came out, what, two weeks ago?
[00:09:49.560 --> 00:09:50.840] And there is a lot of talk.
[00:09:50.840 --> 00:09:59.880] I mean, we have a Y Combinator thread, a Hacker News thread, that we'll put in the show notes, but you can go almost anywhere on the internet and just search for: is GPT-5 worse than 4o?
[00:09:59.880 --> 00:10:02.840] And it does seem like there's some a little bit of friction online.
[00:10:02.840 --> 00:10:10.600] Tracy, do you have thoughts you want to kick us off on whether you've, you know, have you have first-hand experience with it or what's your sentiment around what's going on?
[00:10:10.600 --> 00:10:15.560] I think it's, I'm definitely not a power user, but it was really interesting yesterday.
[00:10:15.560 --> 00:10:21.400] Well, I guess in the last week, because 5 launched and you couldn't choose what model it was using.
[00:10:21.400 --> 00:10:29.000] It was choosing in the back end, you know, whether it's going to be using the slower, more expensive, whatever, robot to answer your questions or the fast, quick one.
[00:10:29.000 --> 00:10:32.120] And then just yesterday, I was using it again, and they added those two things.
[00:10:32.120 --> 00:10:39.400] I was divorced from the discussion on the internet about which one's better or the other, but I was like, oh, thank goodness that I finally choose.
[00:10:39.400 --> 00:10:40.360] I don't want it to.
[00:10:40.360 --> 00:10:43.160] For me, I didn't want it to choose, and I wanted to pick.
[00:10:43.160 --> 00:10:49.120] And then I found this whole wash of information of people screaming about different models and like which one was the better one.
[00:10:49.200 --> 00:10:57.920] And then people, it's understandable if you're not able to choose which one, whether you want it to be fast or slow, or using the expensive cycles or not.
[00:10:57.920 --> 00:11:04.240] Because if you're on a paid account, you know, there's this expectation of having the most expensive version.
[00:11:04.240 --> 00:11:11.920] So I can kind of see why they like monkey patch this back in, this like the thinking, I guess they call it thinking or fast or whatever the other two are.
[00:11:11.920 --> 00:11:15.920] Anyway, that's my experience with just kind of seeing this in the last week, being like, oh, goodness.
[00:11:15.920 --> 00:11:19.680] And then finding this mic on their front and be like, whoa, it's huge here.
[00:11:19.680 --> 00:11:21.680] I am sure Anar has things to say though.
[00:11:21.680 --> 00:11:24.080] So I think I should pass it over to him.
[00:11:24.080 --> 00:11:25.920] Yeah, I mean, yeah, I have opinions.
[00:11:26.240 --> 00:11:28.000] There's a couple of different things here, right?
[00:11:28.000 --> 00:11:35.280] Like, you know, is GPT-5 better than 4o, or whatever the name is?
[00:11:35.440 --> 00:11:36.400] I mean, I think it is.
[00:11:36.800 --> 00:11:37.920] I use it a fair bit.
[00:11:37.920 --> 00:11:42.800] And I think there's no doubt in my mind that, you know, GPT-5 is very good.
[00:11:42.800 --> 00:11:58.480] I think one of the more vindicating things for me is something I've been saying for a while. Actually, in the talk I was supposed to give at MicroConf in NOLA, the one Tracy so expertly gave, one of the points I make is, look, will AI kill SaaS?
[00:11:59.120 --> 00:12:03.440] Which has been a big question for the last couple of years since ChatGPT came out.
[00:12:03.440 --> 00:12:27.960] It's like, look, everyone says it's going to get so smart so soon that you're never even going to need any SaaS; you'll just talk to it, it's superintelligence, all this stuff. And my point has been: if you look at the data on how quickly it's improving and how much it's improving per step, it seems to be doing the opposite of what all these AI doomers are talking about.
[00:12:27.960 --> 00:14:00.560] All the AI doomers are talking about this idea of foom, this exponential takeoff, where it'll soon be so smart that we'll be like ants to it. And I'm like, well, okay, but that's not what the data shows. The data shows it really looks to be reaching some kind of asymptote. That asymptote could be very high, but it still seems to me that each new model that comes out, whether it's from OpenAI or Google or Grok or whatever, is an improvement, but a smaller improvement, a smaller step up, than the last one. And I think GPT-5 just confirms that view for me. Look, this is an extremely disruptive technology. There's a lot here that is disruptive, it's different, it's obviously a very new, very capable technology. But do I think we're going to be ants to it in two years? Do I think we need to bomb data centers because otherwise humanity has ended because of GPT-7? No. I mean, that's bullshit. It always was bullshit. If you looked at the data, this was crap, honestly, when it came out. I looked at it, and there are some really smart people who basically told me I was an idiot when I pointed this out. But there are some fundamental scaling laws underneath this capability growth that just mean that as you make another step in capability, the amount of compute you need to add to make it smarter, as it were, grows exponentially.
[00:14:00.560 --> 00:14:06.160] So you get a linear increase in capability with an exponential growth in compute, and that just doesn't work.
[00:14:06.160 --> 00:14:10.800] Like, you know, like you need a data center the size of the moon to get to the next step just by scaling.
[00:14:10.800 --> 00:14:15.200] And beyond that, it's like, ah, now it's a data center the size of, you know, the galaxy.
[00:14:15.200 --> 00:14:18.560] Like, come on, like, this is, it just doesn't work that way.
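Einar's scaling-law point can be sketched numerically. This is a toy illustration, not anything from the episode: it assumes capability grows logarithmically with compute (a hypothetical stand-in for observed scaling curves), so every equal step up in capability costs a tenfold increase in compute.

```python
import math

def capability(compute: float) -> float:
    """Toy capability curve: logarithmic in compute (purely illustrative)."""
    return math.log10(compute)

# Each hypothetical model generation uses 10x the compute of the last one.
compute_per_gen = [10.0 ** n for n in range(1, 6)]
caps = [capability(c) for c in compute_per_gen]

# Capability gain per generation stays flat (linear growth) even though
# compute grows exponentially: the "data center the size of the moon" problem.
gains = [b - a for a, b in zip(caps, caps[1:])]
```

Under these invented assumptions, `gains` is constant while `compute_per_gen` explodes, which is the "linear capability, exponential compute" shape described above.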
[00:14:18.560 --> 00:14:31.600] And so, for our listeners, to a degree it must be sort of a relief to see that, you know, GPT-5 isn't superhuman and isn't going to reproduce any SaaS with the flick of a button for 20 bucks a month.
[00:14:31.600 --> 00:14:36.560] And so it's more like, okay, this is an extremely capable technology that, yeah, you need to stay on top of.
[00:14:36.560 --> 00:14:38.960] Like, you need to incorporate it into your SaaS business.
[00:14:38.960 --> 00:14:42.400] You need to incorporate it into your business, or you will be left behind.
[00:14:42.400 --> 00:14:44.240] But you don't need to become a plumber.
[00:14:44.240 --> 00:14:49.040] I mean, nothing wrong with plumbers, but like, it's not like technology as a job is over.
[00:14:49.040 --> 00:14:50.480] So, so, yeah, that's what I think.
[00:14:50.800 --> 00:15:04.880] When Einar talked about the asymptote, whenever it was, a year ago or six months or something, it resonated with me, because the first ChatGPT that I used feels pretty similar to the one that I use today.
[00:15:05.200 --> 00:15:10.240] You know, it's a little better, obviously, but it's improving in a small way.
[00:15:10.240 --> 00:15:16.560] And I broke my own personal rule, which is never read the Hacker News comments, because there's no article here.
[00:15:16.560 --> 00:15:20.640] It's just a "GPT-5 is slow and no better than 4" post that asks a question, right?
[00:15:20.640 --> 00:15:26.880] But I did like, you know, the first comment, which says GPT-5 is better as a conversational thinking partner than 4o.
[00:15:26.880 --> 00:15:30.560] Its answers are more focused, concise, and informative, and it goes on.
[00:15:30.560 --> 00:15:33.040] And probably, you know, that's probably accurate.
[00:15:33.040 --> 00:15:41.920] The thing it reminds me of is, you know, we're thinking the new model is going to be able to do everything better than all the old models.
[00:15:41.920 --> 00:15:44.400] And that's the difference is I think we're getting to this point.
[00:15:44.400 --> 00:15:49.280] I'm probably not the first person to say this, but where there's going to start being GPTs just for this.
[00:15:49.280 --> 00:15:51.360] Like if it's like a brainstorming GPT, right?
[00:15:51.360 --> 00:15:57.920] And it's optimized for that versus like long form copy or short form copy or conversation or coding or, right?
[00:15:57.920 --> 00:15:59.880] There's all these uses people are using it for.
[00:15:59.880 --> 00:16:03.960] And if it was AGI, then maybe it could do all these things, but it's not.
[00:15:59.520 --> 00:16:06.440] And I don't see a near future where that is.
[00:16:06.600 --> 00:16:14.680] So I think the specialization of AI, of these models, already exists, obviously, but I don't know that most people think about it that way.
[00:16:14.680 --> 00:16:16.360] And they just think new model better on everything.
[00:16:16.360 --> 00:16:18.120] And I don't think that's going to happen.
[00:16:18.120 --> 00:16:19.160] I think that's definitely true.
[00:16:19.160 --> 00:16:25.960] I mean, like, it's to the point where like when GPT-5 came out, so I pay for the pro level 200 bucks a month, whatever.
[00:16:25.960 --> 00:16:31.560] And when it came out, I was genuinely a little sad that I couldn't talk to 4.5.
[00:16:31.800 --> 00:16:35.720] That's still my, like, it's almost like it almost has, it has personality, some of it.
[00:16:35.720 --> 00:16:37.160] And I really like 4.5.
[00:16:37.160 --> 00:16:41.160] And so when they discontinued 4.5, I was like, this is rough.
[00:16:41.160 --> 00:16:45.960] Because like, yeah, in some ways, 5 is more capable in some of the tasks that I have it running on.
[00:16:45.960 --> 00:16:49.080] But 4.5 was just a better conversationalist.
[00:16:49.080 --> 00:16:49.720] It just was.
[00:16:49.720 --> 00:16:52.440] And so when I got it back, I was like, yes, thank you.
[00:16:53.000 --> 00:16:54.360] I'm extremely happy to have it back.
[00:16:54.440 --> 00:16:55.000] And that's true.
[00:16:55.720 --> 00:16:57.240] I was a little sad when it disappeared.
[00:16:57.240 --> 00:16:58.360] And now I'm like, yeah, it's back.
[00:16:58.360 --> 00:16:59.560] It's amazing.
[00:16:59.560 --> 00:17:22.120] And then, I want to say 4.5 also had some amount of, I don't know if the right word for this is neutering, but it felt like it was kind of pulled back a little bit. In 5, there are still some topics it won't talk about, but I believe 5 hasn't been prevented from talking about certain topics as much as 4.5 was.
[00:17:22.120 --> 00:17:23.800] Have you run into that, Einar?
[00:17:24.600 --> 00:17:29.000] I thought 4.5 was pretty, actually one of the interesting things is they said, what was it?
[00:17:29.000 --> 00:17:35.080] Oh yeah, only pro users, at the $200 a month level, are going to be able to use 4.5 from now on.
[00:17:35.080 --> 00:17:38.280] Because this apparently is an extremely expensive model.
[00:17:38.280 --> 00:17:41.720] And the nice thing about 4.5 is it doesn't have a thinking piece.
[00:17:41.720 --> 00:17:44.960] And it's like, when you're doing back and forth, I don't want you to be thinking.
[00:17:44.440 --> 00:17:48.960] Like, I don't want you to spend like a minute and a half thinking about the response.
[00:17:44.680 --> 00:17:50.240] I need you just to respond.
[00:17:50.720 --> 00:18:00.320] And to me, that seems to be like, it almost feels like 4.5 is like the, which I guess 5 is too, like the 5 fast, maybe, although that does seem a little brain dead compared to 4.5.
[00:18:00.320 --> 00:18:09.440] It's like, 4.5 seemed to me like this is where the current paradigm is able to go with pure scaling without those kind of thinking hacks on top of it.
[00:18:09.440 --> 00:18:11.040] That's what it feels like to me.
[00:18:11.840 --> 00:18:26.480] There was speculation on X Twitter about, maybe it wasn't speculation, maybe it was backed up with data or whatever, but it was kind of like five is less good, but it is because it's cheaper and OpenAI is thinking about cost, you know, and there were folks kind of piping in with that.
[00:18:26.480 --> 00:18:29.760] So maybe that's maybe that's part of it too.
[00:18:31.040 --> 00:18:34.480] Do you wish you had a technical co-founder to help bring your idea to market?
[00:18:34.480 --> 00:18:40.160] Gearheart specializes in helping early stage founders build products users actually want.
[00:18:40.160 --> 00:18:51.440] Their AI-powered approach tests assumptions, validates concepts with real users, and builds scalable MVPs with best practices that won't collapse when your first customers arrive.
[00:18:51.440 --> 00:18:56.800] Founded by entrepreneurs who've launched and exited their own startups, they understand the journey you're on.
[00:18:56.800 --> 00:19:05.280] For the last 13 years, they've helped launch over 70 B2B SaaS platforms, including SmartSuite, which raised $38 million.
[00:19:05.600 --> 00:19:15.280] Book your free strategy session at gearheart.io and mention this podcast to get 20% off validation, discovery, or prototyping for your early stage venture.
[00:19:15.280 --> 00:19:18.000] That's gearheart.io.
[00:19:19.280 --> 00:19:25.040] Our next story is kind of, I'm going to frame it as like, I don't know, are we in an AI bubble?
[00:19:25.040 --> 00:19:27.280] I think the answer is obviously yes, with the amount of money flowing in.
[00:19:27.280 --> 00:19:29.440] But the idea is, you know, what's it going to look like as it pops?
[00:19:29.440 --> 00:19:38.840] And we're going to have, I have a Reddit thread here, but this is another one where AI compute is essentially being given away for far less than it takes to produce, right?
[00:19:38.840 --> 00:19:49.640] OpenAI and Anthropic and all of these are losing hundreds of millions, billions of dollars, whatever it is, because they are buying something for a dollar and giving it away for 20 cents.
[00:19:49.640 --> 00:19:51.720] So there's a couple ways to think about this.
[00:19:51.720 --> 00:19:57.880] I mean, I know developers got pretty angry online when suddenly it was like, wait, they're limiting my, I don't even remember what service it was.
[00:19:57.880 --> 00:19:58.920] Was it Claude Code or something?
[00:19:58.920 --> 00:20:00.440] It was like, now I have limits on this thing.
[00:20:00.440 --> 00:20:05.560] And it's like, well, of course, because you're probably chewing up $500 worth of AI and you're paying 20 bucks a month.
[00:20:05.560 --> 00:20:06.840] Like, this doesn't make any sense.
[00:20:06.840 --> 00:20:08.600] So, you know, that's kind of a first topic.
[00:20:08.600 --> 00:20:24.680] But the other one is, it reminds me, there are a lot of wrappers and a lot of SaaS apps where I think AI is at their core, where all they're doing is taking these virtually free, almost free AI compute credits and doing things with them that, if they were actually charged at cost, would be completely ridiculous.
[00:20:24.680 --> 00:20:26.920] Like, you know, creating headshots for me, right?
[00:20:26.920 --> 00:20:33.000] Or creating thumbnails or creating internal building or outside vision, any of these small image things or the conversion things.
[00:20:33.000 --> 00:20:39.560] I think if you actually had to charge what that really cost, these are unviable businesses, is my hypothesis.
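The wrapper-viability hypothesis is ultimately just margin math. A minimal sketch with invented numbers (the subscription price, the subsidized API cost, and the "true" unsubsidized cost are all assumptions for illustration, not figures from the episode):

```python
# Hypothetical unit economics for an AI "wrapper" app; all numbers invented.
subscription = 20.00          # monthly price charged per user
subsidized_api_cost = 4.00    # assumed per-user compute cost while providers subsidize
true_api_cost = 30.00         # assumed per-user cost if charged at what it costs to produce

margin_today = subscription - subsidized_api_cost    # positive while subsidized
margin_at_true_cost = subscription - true_api_cost   # negative: the business inverts
```

The point is structural: a business that clears margin only while a supplier sells below cost flips to losing money per user the moment the subsidy ends.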
[00:20:39.560 --> 00:20:42.040] And that's what I want to throw out to the panel: if you have thoughts on that.
[00:20:42.040 --> 00:20:42.920] Let's start with you, Einar.
[00:20:42.920 --> 00:20:44.120] What are your thoughts here?
[00:20:44.120 --> 00:20:45.240] I think it's hard to tell.
[00:20:45.240 --> 00:21:05.080] It's one of those things where I think, okay, yeah, it's expensive now, but a good enough capability is probably achievable on a self-hosted model for a lot of these businesses, certainly with, you know, fine-tuning, and, as you talked about, you get different model specializations.
[00:21:05.080 --> 00:21:12.160] So, yeah, maybe these startups can't just do API calls to the latest ChatGPT model.
[00:21:12.160 --> 00:21:16.960] And, like, you know, as they crank up the cost there, you know, maybe that becomes unsustainable.
[00:21:17.280 --> 00:21:46.720] But I also think, okay, if it becomes so expensive that your model doesn't work, well, then, given the specialization you're in, and particularly if it's a very specialized vertical SaaS thing you're doing, you can probably get good enough models by taking an open source model, fine-tuning it, and really honing it toward your use case. Which, honestly, some of these companies probably need to be doing, or thinking about, anyway.
[00:21:46.720 --> 00:21:54.960] Once you get to scale and you're a certain size, if really all you are is "if this, then that" plus API calls to OpenAI, okay.
[00:21:55.280 --> 00:21:57.600] Well, then you're like, you know, they have you by the balls.
[00:21:57.600 --> 00:22:08.160] And versus like, if I had a 20 million ARR business that was doing that right now, I'd be like, hmm, yeah, let's spend some money and some time figuring out like, can we have our own open source model?
[00:22:08.160 --> 00:22:13.200] Like, what does it look like to fine-tune that model so that we can host it, so that we can really control the whole stack?
[00:22:13.200 --> 00:22:14.480] Is what I would say.
[00:22:14.480 --> 00:22:16.000] Yeah, there definitely is.
[00:22:16.560 --> 00:22:17.040] It's interesting.
[00:22:17.040 --> 00:22:29.840] I'm going to pass it to you in a second, Tracy, but that makes me think that these underlying AI providers, the LLM API providers, I imagine that they, you know, what happens when they do GPT-cheap?
[00:22:29.840 --> 00:22:34.400] And we know that it's like the 1.0 or 2.0 version, but it's a fifth or a tenth of the price.
[00:22:34.400 --> 00:22:41.520] And to your point, Einar, it's good enough to do a lot of stuff, because it was good enough 18 or 24 months ago, but it's way cheaper, right?
[00:22:41.520 --> 00:22:49.520] So that could be a thing of the future, where we haven't really gotten to the point of thinking about cost, because I pay 20 bucks a month for ChatGPT and I don't even think about it.
[00:22:49.520 --> 00:22:51.920] And if they charge me 50, I would still pay it.
[00:22:52.080 --> 00:22:54.560] But, you know, it's, I don't know, the cost just isn't top of mind.
[00:22:54.560 --> 00:22:56.800] I don't think it's anything that a lot of us are thinking about.
[00:22:56.800 --> 00:22:58.880] I don't think about the cost of API these days.
[00:22:59.040 --> 00:22:59.680] How about you, Tracy?
[00:22:59.680 --> 00:23:00.680] What are your thoughts here?
[00:23:01.240 --> 00:23:09.480] This is the place where, if you're a founder building tools and services off of AI, and a lot of folks are doing that right now,
[00:23:09.480 --> 00:23:19.560] you should be thinking now about what the risks are, because you're seeing these changes come down the funnel, you're seeing things become more expensive.
[00:23:19.560 --> 00:23:23.480] You're seeing, you know, maybe you're not going to have as much functionality as before.
[00:23:23.480 --> 00:23:25.960] You can't just like build right now and assume it's going to be that way forever.
[00:23:25.960 --> 00:23:35.000] So the people who are thinking about it today are the ones who are more likely to be living, still existing a couple years from now because they already have the expectations that this is going to happen.
[00:23:35.000 --> 00:23:36.280] Like it's going to get more expensive.
[00:23:36.280 --> 00:23:37.240] It's going to get harder to use.
[00:23:37.560 --> 00:23:41.640] You need to start putting the work in today to future-proof yourself.
[00:23:41.640 --> 00:23:48.280] And if you don't, like a couple years from now, it could be like, oh, wait, I built this whole business on the fact that this was really cheap and now it's gone away.
[00:23:48.280 --> 00:23:49.640] And oh, my business is over.
[00:23:49.800 --> 00:23:50.520] It's just so sad.
[00:23:51.080 --> 00:23:52.360] No, so it's like, think today.
[00:23:52.360 --> 00:24:09.400] Changes are happening so fast that thinking out different ways to differentiate yourself, different ways to get that data and provide it to your customers, what you can do to future-proof yourself, is going to determine who's still around in a couple years versus not.
[00:24:09.720 --> 00:24:17.960] Our next story is about Windsurf, which was a venture-backed startup that, do I need to say allegedly, allegedly screwed its employees?
[00:24:17.960 --> 00:24:27.560] But we'll link to a tweet from one of the employees who basically said, I was employee number two and I've worked on AI and code for years.
[00:24:27.560 --> 00:24:35.000] And all of my options, my vested shares, I basically got 1% of that value.
[00:24:35.000 --> 00:24:45.520] And the quick, the quick summary of this, I mean, it's kind of complicated, but OpenAI in July, early July of 2025, almost acquired Windsurf for $3 billion, but the deal fell apart.
[00:24:44.920 --> 00:24:48.480] Google swooped in and acqui-hired.
[00:24:48.720 --> 00:24:52.080] And again, I don't know if I need to say allegedly, like, you know, what are the facts here?
[00:24:52.080 --> 00:24:57.600] But Google struck a $2.4 billion arrangement, a non-exclusive license to Windsurf's technology.
[00:24:57.600 --> 00:25:07.120] They basically gutted the company. Non-exclusive license, but they hired the CEO, the co-founder, all the key R&D leaders, and no equity or control of the company changed hands.
[00:25:07.120 --> 00:25:14.160] And so then another company, Cognition, jumped in and acquired kind of what's left, which to me feels a bit like a shell of a company.
[00:25:14.160 --> 00:25:17.440] But early employees are like, I got nothing.
[00:25:17.440 --> 00:25:22.560] The investors and the founders supposedly got roughly half of that $2.4 billion.
[00:25:22.560 --> 00:25:25.680] So roughly $1.2 billion went to investors and founders.
[00:25:25.680 --> 00:25:30.560] And there was some money left in the company, but again, it feels kind of gutted.
[00:25:30.560 --> 00:25:40.160] And really what I want to talk about here, it's an interesting situation because it feels like the social contract of Silicon Valley was violated.
[00:25:40.160 --> 00:25:42.400] And we've heard little stories about this over the years.
[00:25:42.400 --> 00:25:49.280] I think TopTal is either an LLC or maybe they raised a safe and they never converted it to equity.
[00:25:49.280 --> 00:25:52.560] And allegedly, the founder just never converted it.
[00:25:52.560 --> 00:25:56.240] So no one actually has equity, even though they invested and is just taking money out of the company.
[00:25:56.240 --> 00:25:57.920] You know, again, allegedly.
[00:25:57.920 --> 00:26:00.800] But in this case, like people did get screwed.
[00:26:00.800 --> 00:26:13.680] And oftentimes, when I read these types of stories. We talked on this show, on a Hot Take Tuesday, about the founders of one of these betting sites, how they had sold for a billion dollars and didn't get any money years later.
[00:26:13.680 --> 00:26:15.200] And we talked through like how that actually works.
[00:26:15.200 --> 00:26:17.760] And I mean, usually they don't report the whole story, is what happens.
[00:26:17.760 --> 00:26:25.760] It's like, yeah, they sold all their shares, or they took secondary out early on, or they raised hundreds of millions and diluted themselves into nothing, on and on.
[00:26:25.760 --> 00:26:28.400] Like, there's usually more to the story than they actually report.
[00:26:28.400 --> 00:26:33.080] But in this one, I feel like the employees got screwed.
[00:26:33.080 --> 00:26:35.080] So, do you want to kick us off, Einar?
[00:26:35.480 --> 00:26:37.480] What are your thoughts on what happened here?
[00:26:37.560 --> 00:26:38.600] This sets a precedent.
[00:26:38.600 --> 00:26:41.160] I mean, this is dirty pool.
[00:26:41.480 --> 00:26:46.680] I mean, I have to admit, I was in shock when I first read the story.
[00:26:46.680 --> 00:26:49.320] And it's still like, it's got to be.
[00:26:49.720 --> 00:26:51.960] Is it a breach of the social contract of Silicon Valley?
[00:26:52.120 --> 00:26:53.000] Hell yeah.
[00:26:53.000 --> 00:26:54.120] Hell yeah, it is.
[00:26:54.120 --> 00:26:55.400] Like, are you kidding me?
[00:26:55.400 --> 00:26:57.480] Like, it was supposed to be a $3 billion.
[00:26:57.560 --> 00:26:59.000] Imagine if you're the employee, right?
[00:26:59.000 --> 00:27:03.720] You sit around and you're like, oh, crap, we're being bought by OpenAI for $3 billion.
[00:27:03.720 --> 00:27:05.320] Like, I've made it.
[00:27:05.320 --> 00:27:07.400] Like, I'm doing options math.
[00:27:07.400 --> 00:27:09.240] Like, what portion am I buying?
[00:27:09.400 --> 00:27:11.560] Am I buying the Tahoe place immediately?
[00:27:11.560 --> 00:27:13.080] Or do I wait six months?
[00:27:13.080 --> 00:27:14.760] Like, things are good.
[00:27:15.080 --> 00:27:19.560] And then you come in and you're like, oh, the deal fell apart.
[00:27:19.560 --> 00:27:27.720] And the founders and some people, like, the selected ones, are getting almost as much money or maybe more and going to Google.
[00:27:27.720 --> 00:27:37.320] And I'm behind with the unchosen ones in this company now, where Google owns like a license to what?
[00:27:37.320 --> 00:27:37.880] What?
[00:27:38.200 --> 00:27:39.560] Like, that's not cool.
[00:27:39.560 --> 00:27:57.080] You know, there was some argument saying, look, they put like a hundred million dollar investment into the company, and that was the equivalent of, what, the leftovers for the people, and they could have divvied that out, and they chose not to, blah, blah, blah.
[00:27:57.160 --> 00:28:01.240] I'm still like, yeah, still, you got screwed is what happened.
[00:28:01.240 --> 00:28:02.360] You got screwed.
[00:28:02.360 --> 00:28:03.480] It is not good at all.
[00:28:03.480 --> 00:28:04.840] I think it's bad precedent.
[00:28:04.840 --> 00:28:07.880] And I hope this is not something that people start doing.
[00:28:07.880 --> 00:28:09.560] Tracy, what are your thoughts?
[00:28:09.560 --> 00:28:17.520] You know, as someone who has been in and around Silicon Valley for, you know, way too long, it's really funny looking back. Not funny ha-ha.
[00:28:17.680 --> 00:28:23.440] You know, back in the day, in the early 2000s, big things are happening,
[00:28:23.440 --> 00:28:25.920] and these first employees are also becoming billionaires and whatnot.
[00:28:25.920 --> 00:28:39.440] But it feels like over time, investors and acquiring companies have figured out, and I don't know, this might be hearsay, but from my perspective it feels like there is now a bag of tricks that they have.
[00:28:39.440 --> 00:28:57.680] You know, there's all sorts of little tweaks and little rules and limits and contractual things you can put into term sheets and whatnot that make it harder and harder for these first employees to get any sort of winnings from things like this.
[00:28:57.680 --> 00:28:58.480] And it does suck.
[00:28:58.480 --> 00:29:04.800] It does feel like that social contract is broken because people come into Silicon Valley and you're going to join a startup and you're going to get rich.
[00:29:04.800 --> 00:29:08.960] And it feels like it is getting harder and harder for that to actually happen.
[00:29:09.360 --> 00:29:14.880] And the only way, like you said, is to join a FAANG company as an employee, where you just make way more money.
[00:29:14.880 --> 00:29:16.560] You don't have to worry about any of this.
[00:29:16.560 --> 00:29:20.080] Or you're like startup hopping and increasing your salary that way.
[00:29:20.080 --> 00:29:28.880] But you can't, it's hard to expect or you can't really expect anymore to have this big payout from your startup that you joined as employee number one.
[00:29:28.880 --> 00:29:31.920] And now the founders are making bank.
[00:29:32.240 --> 00:29:33.840] Well, so I don't disagree with that.
[00:29:33.840 --> 00:29:35.920] I mean, like, to a degree, I don't disagree with it.
[00:29:35.920 --> 00:29:42.320] I think there's always like, look, there's a difference between a company sort of not going all that well, like a soft landing type stuff.
[00:29:42.320 --> 00:29:47.280] And in those types of situations, yeah, quite often it's like the founders get taken care of.
[00:29:47.280 --> 00:30:00.200] Like, the founders end up with some sort of a package because their buddies at Google acqui-hired them, and they're now worth, look, not billions, but multiple millions, or even 10 million over time, because they stayed employed.
[00:30:00.200 --> 00:30:02.760] Versus, like, yeah, in that case, common shareholders are wiped out.
[00:30:02.760 --> 00:30:04.440] And it's just like, look, you got a job.
[00:29:59.920 --> 00:30:04.840] Good luck.
[00:30:05.160 --> 00:30:10.360] I mean, you're probably hearing all the stuff about everyone kvetching about Philz in Silicon Valley, right?
[00:30:10.360 --> 00:30:11.160] Yeah, yeah, exactly.
[00:30:12.040 --> 00:30:15.640] Everyone, all the early employees, are like, no. But, like, Philz was an example.
[00:30:15.800 --> 00:30:16.760] They were not doing well, right?
[00:30:16.760 --> 00:30:23.000] When you looked at their actual numbers, and so they just got bought by PE, and everyone underneath was wiped out.
[00:30:23.560 --> 00:30:50.280] Yeah, and there are numbers of stories like that. But to a degree, you know, there are other scenarios too, which you see often with private equity type buyouts and more acqui-hire type buyers, where actually the investors are getting screwed. They come along and say, look, we're going to buy this company for 50 or 100 million dollars, but actually we're going to pay $3 million for the equity, or 1X on the equity, and the rest is carved out for retention.
[00:30:50.280 --> 00:30:51.720] That happens too.
[00:30:51.720 --> 00:30:56.760] And really, like, what it boils down to is like, it's about what the founders decide to fight for.
[00:30:56.760 --> 00:30:58.040] That's what it's about.
[00:30:58.360 --> 00:30:59.880] And that's just what it is.
[00:30:59.880 --> 00:31:05.480] You know, my sort of view on this is, yeah, does it reflect badly on the Windsurf founders?
[00:31:05.480 --> 00:31:06.760] Yeah, I think it does.
[00:31:06.760 --> 00:31:09.080] I think it reflects poorly on Google's part too.
[00:31:09.080 --> 00:31:21.480] And I think it was DeepMind, which is, I don't know if that's a company or a department within Google, but I think anytime employees, one of the great things, look, Silicon Valley has done a lot of good things and a lot of not good things.
[00:31:21.480 --> 00:31:29.320] And I think one of the good things it has done is allowed folks, whether you're a founder or whether you're an early employee, to participate in upside of companies.
[00:31:29.320 --> 00:31:31.720] And there's a lot of places in the world where that doesn't exist.
[00:31:31.720 --> 00:31:32.960] There's a lot of places in the U.S.
[00:31:33.000 --> 00:31:34.360] where that doesn't exist.
[00:31:34.360 --> 00:31:40.600] And I think Silicon Valley has tended in general over time to do a decent job of that.
[00:31:40.600 --> 00:31:42.440] Not perfect, but this is different.
[00:31:42.440 --> 00:31:44.960] And I think it's a real problem.
[00:31:45.440 --> 00:31:47.120] I think that's because that's the big difference.
[00:31:47.120 --> 00:31:49.360] Like, this is not like an acquisition.
[00:31:44.600 --> 00:31:50.080] Well, it is on paper.
[00:31:50.160 --> 00:31:54.160] It's an acqui-hire, mostly because of regulations around who can buy what.
[00:31:54.160 --> 00:31:55.200] But it's like.
[00:31:55.520 --> 00:32:01.040] The original deal, which was like a successful, holy crap, that's a big number deal was $3 billion.
[00:32:01.040 --> 00:32:02.720] The Google acqui-hire is 2.4.
[00:32:02.960 --> 00:32:04.960] That's ballpark the same number.
[00:32:04.960 --> 00:32:11.840] So, like, employees got screwed as if this was a soft landing, but it was a massive payday for the founders.
[00:32:11.840 --> 00:32:13.040] That's the big difference.
[00:32:13.040 --> 00:32:13.680] Yep.
[00:32:13.680 --> 00:32:25.200] And I think, you know, there are only, I mean, with the middle class declining in the U.S., which is a major problem, there are only so many opportunities, you know, for quote unquote the American dream.
[00:32:25.200 --> 00:32:29.360] Like, does it still exist the way it did 60 years ago, 70 years ago?
[00:32:29.520 --> 00:32:30.800] I don't think so.
[00:32:30.800 --> 00:32:32.880] And I think that that's a problem.
[00:32:32.880 --> 00:32:34.320] I'm going to disagree on here.
[00:32:34.320 --> 00:32:44.320] I actually think one of the interesting stats about this declining middle class stuff is like, is actually, yeah, the middle class is declining, but that's partly because the upper class is expanding.
[00:32:44.320 --> 00:32:45.120] Is it?
[00:32:45.440 --> 00:32:45.840] Yeah, yeah.
[00:32:46.240 --> 00:32:50.320] So it's like, look, the middle class is declining, but also because people are getting richer.
[00:32:50.720 --> 00:32:54.320] A smaller percentage of people are getting richer, while a larger percentage are getting poorer.
[00:32:54.560 --> 00:32:56.960] But I think a larger, yeah, I think people struggle.
[00:32:57.200 --> 00:33:03.040] When I go to get gas or buy avocados, I'm like, how do people, if I was making 20 bucks an hour, how the f would I afford this stuff?
[00:33:03.040 --> 00:33:04.880] It's crazy expensive.
[00:33:04.880 --> 00:33:06.240] Yeah, that's true, yes.
[00:33:06.240 --> 00:33:08.880] But good for those people who are getting rich.
[00:33:09.760 --> 00:33:10.240] Yeah.
[00:33:10.480 --> 00:33:11.120] I mean, you know.
[00:33:11.360 --> 00:33:12.080] Billions of dollars.
[00:33:12.240 --> 00:33:21.280] I mean, one of the craziest things to me lately, like, the truly holy-crap bananas thing for me, is what Zuck's doing with poaching AI researchers.
[00:33:21.280 --> 00:33:22.400] Like, insane.
[00:33:22.400 --> 00:33:23.280] People are getting paid.
[00:33:23.280 --> 00:33:26.160] Like, if that's true, like, I started reading these numbers.
[00:33:26.400 --> 00:33:26.960] This is bananas.
[00:33:26.360 --> 00:33:28.080] Like, no one's getting paid like that out of the market.
[00:33:28.240 --> 00:33:29.120] What are the numbers?
[00:33:29.680 --> 00:33:40.680] I thought it was $100 million as the complete package, including equity and salary and all this stuff, for multiple AI researchers.
[00:33:41.080 --> 00:33:43.800] Yeah, it's unprecedented.
[00:33:43.880 --> 00:33:50.840] Hey, Zuck, if you're listening to this, I would totally leave behind TinySeed for a $100 million package.
[00:33:51.080 --> 00:33:57.880] If you need a podcaster and a voice of AI, really opinionated Norwegian.
[00:33:57.880 --> 00:34:00.600] I actually, I think I'd probably do it for like 99.
[00:34:00.760 --> 00:34:02.280] Yeah, give you a discount.
[00:34:02.280 --> 00:34:03.080] Oh my gosh.
[00:34:03.080 --> 00:34:03.560] Yeah.
[00:34:03.560 --> 00:34:04.760] A special discount.
[00:34:05.160 --> 00:34:05.400] Yeah.
[00:34:05.400 --> 00:34:14.280] And to be clear about the previous story with Windsurf, I obviously, more than anyone, don't begrudge any founder wanting to change their life or to get rich building a company.
[00:34:14.280 --> 00:34:19.080] I think capitalism and democracy have done a pretty decent job of that.
[00:34:19.080 --> 00:34:21.160] And, you know, if anyone believes in it, I do.
[00:34:21.160 --> 00:34:25.400] But this in particular, these kind of edge cases where people get screwed, I think, is a real problem.
[00:34:25.400 --> 00:34:29.160] And my hope is that it's not something that continues.
[00:34:29.160 --> 00:34:42.840] Our last story of the day is more SaaS-focused, not news of the world, but it is an article on founderlabs.io: how to sell if your user is not the buyer.
[00:34:42.840 --> 00:34:46.280] For example, your developer is your user, but the CTO is the buyer.
[00:34:46.280 --> 00:34:48.760] And this is by Nate Ritter.
[00:34:48.760 --> 00:35:01.080] And we, of course, see this with Tiny Seed companies where it's the easiest thing to do is if you are a bootstrap founder, you are patient zero, meaning you have the need, you validate it, and then you build it for you.
[00:35:01.080 --> 00:35:07.480] You know exactly who uses it, how they use it, and then you sell it to those people, and it's one user, and they make the decision.
[00:35:07.480 --> 00:35:10.520] That is the best scenario with B2B SaaS.
[00:35:10.520 --> 00:35:14.280] But it's, I would say, probably the minority of SaaS that we see.
[00:35:14.280 --> 00:35:19.680] And so, at a certain point, you do have to learn how to sell to folks who are not your end user.
[00:35:19.680 --> 00:35:22.960] And Nate talks through some thoughts and best practices.
[00:35:22.960 --> 00:35:25.520] Tracy, what are your thoughts on his piece?
[00:35:25.520 --> 00:35:26.320] I love this.
[00:35:26.320 --> 00:35:29.520] I think it's, like I said, we see this a lot with TinySeed.
[00:35:29.520 --> 00:35:34.800] And a lot of people are like, why can't it be easier than this?
[00:35:34.800 --> 00:35:37.440] Like, why can't these people just pay for my tool?
[00:35:37.440 --> 00:35:42.720] But they have to go through all these hoops where they have to talk to their manager and their manager has to talk to the buying team.
[00:35:42.720 --> 00:35:54.960] And they're talking to this developer who's saying how this would be such a great win, but then they, as the founder and the owner of this tool, are divorced from being able to be in the room with those decision makers.
[00:35:54.960 --> 00:36:02.960] This article does a really good job, I think, of explaining what you could do as the SaaS founder in your product to make it easier.
[00:36:02.960 --> 00:36:20.560] One of the things mentioned that I thought was really great was, if you're able to build tools and resources for the non-buyer, the person who's using your product, the things they need to convince their leadership, you should do that.
[00:36:20.560 --> 00:36:22.720] Like think through, think like three steps ahead.
[00:36:22.720 --> 00:36:33.920] You have to do a harder sales process in that way and think through what the entire sales cycle is like and see through what tools you have on your end that you can give to those folks to help them make the deal and just talk to people.
[00:36:33.920 --> 00:36:35.840] But I know those are just kind of like band-aids.
[00:36:35.840 --> 00:36:40.880] I'm sure Einar has some thoughts here as a true salesperson on this three-person team.
[00:36:40.880 --> 00:36:42.560] Einar, what do you think?
[00:36:42.560 --> 00:36:44.560] I mean, I thought it was a good article.
[00:36:44.560 --> 00:36:46.400] You know, there's a bunch of stuff to this.
[00:36:46.640 --> 00:36:51.520] I think for me, it's more like, yeah, I don't disagree on this.
[00:36:51.520 --> 00:37:08.280] And I think at a higher level, I think a lot of, particularly bootstrap founders, they struggle conceptually or culturally, or whoever you pointed out, this notion that they kind of just want to, they just, a lot of them, not all, but they kind of just want to, they just want to like make money when they're sleeping.
[00:37:08.280 --> 00:37:13.400] They don't like buying that way; they're not the typical enterprise software buyers.
[00:37:13.400 --> 00:37:20.840] And so they can't believe that somebody doesn't want to go to like a sales page and there's the pricing and you just click and put your credit card in.
[00:37:20.840 --> 00:37:25.880] And like, why would, why isn't all software in the entire world, why isn't everything like this?
[00:37:26.200 --> 00:37:33.080] They think it's a dark pattern to hide pricing, and "call sales" is like a dirty word, and all this stuff.
[00:37:33.240 --> 00:37:41.000] And the fact of the matter is like almost 100% of the time, let's, okay, let's call it 80, 90% of the time.
[00:37:41.000 --> 00:37:43.000] If you think you're different, you're probably not.
[00:37:43.000 --> 00:37:55.080] You can get to, I would say, 1 million, 2 million, 3 million of ARR, somewhere in there for a lot of vertical SaaS businesses with that sort of self-serve model.
[00:37:55.080 --> 00:38:03.480] Like you can differentiate the pricing and like, you know, you got to figure out, there's a lot of stuff, don't get me wrong, there's a lot of stuff you got to get right to get to that level too.
[00:38:03.480 --> 00:38:18.280] But a lot of businesses, once they get to that stage, if you want to keep increasing your growth and take from that one, two, three million up to like five, 10, 15, 20, you really need to start, you know, unlocking enterprise sales.
[00:38:18.280 --> 00:38:22.120] And like what that means is dealing with issues like this.
[00:38:22.120 --> 00:38:26.280] And the fact of the matter is, you know, a lot of founders just don't want to do that.
[00:38:26.280 --> 00:38:34.920] And actually, shout out to Jason Cohen, friend of the show, who put out a new metric here, the max MRR, that I thought was quite interesting.
[00:38:34.920 --> 00:38:42.600] And it's this notion that, given where you're at in terms of your churn rate and all this stuff, how much new MRR are you booking?
[00:38:42.600 --> 00:38:44.200] Like, what is your max MRR?
[00:38:44.200 --> 00:38:48.240] Where's your asymptote, to pull back to an earlier point?
[00:38:44.680 --> 00:38:50.960] And the fact of the matter is, like, you can do that math earlier on.
[00:38:51.120 --> 00:39:01.920] Like, you can know, like, given where I'm at now and the kind of new MRR I'm booking and the churn rate that I've got, once you get to a certain size, you can do the math and know where you're going, where you're gonna tap out of that.
[00:39:01.920 --> 00:39:04.800] And fundamentally, you need to bend that curve.
[00:39:04.960 --> 00:39:06.560] You need to change those metrics.
[00:39:06.560 --> 00:39:11.520] And changing those metrics typically means, okay, you're landing larger accounts who churn less.
[00:39:11.520 --> 00:39:13.040] That's just the way it is.
[00:39:13.040 --> 00:39:19.040] And what also pisses me off sometimes is like, I see these founders, they get to like 1 million, 2 million, whatever.
[00:39:19.040 --> 00:39:21.360] And they're like, this is like, I can't figure it out.
[00:39:21.360 --> 00:39:22.960] Or like, I just don't want to do anything more.
[00:39:22.960 --> 00:39:29.200] And like, you know, then basically, if you don't do anything more, you'll stall out and your business is worth less and less, even though you're maybe growing slightly.
[00:39:29.200 --> 00:39:33.920] Or they're like, I'm going to sell to this private equity group that comes through the door and they're like, whatever they're paying them, but I don't care.
[00:39:34.000 --> 00:39:34.960] Like, it's for good money.
[00:39:34.960 --> 00:39:53.600] And the fact of the matter is, this is what a lot of these groups know how to do: they've figured out that if you got it from zero to one, got it to a million or two or three, and didn't build out an enterprise sales effort when it's clear the business could do so much more, then so much of the value of what you created compounds to them very quickly.
[00:39:53.600 --> 00:40:04.640] You could have spent five or six years getting it to one or two million, and then they buy it, and then two, three years later, they make 80% of the money.
[00:40:04.640 --> 00:40:13.200] Yeah, this is why when founders are listening to advice on the internet, you need to ask, what's the source and what are their goals?
[00:40:13.200 --> 00:40:18.000] And it used to be, we'd see all this VC-backed advice for startup founders.
[00:40:18.000 --> 00:40:21.520] Like, you have to go after a billion-dollar market, and you know, you have to, have to, have to.
[00:40:21.520 --> 00:40:24.960] And Bootstrap founders would read this and be like, I have to go after a billion-dollar market, right?
[00:40:24.960 --> 00:40:27.440] And that's where I came in and was like, no, stop doing that.
[00:40:27.440 --> 00:40:28.800] Like, you're a bootstrapper.
[00:40:28.800 --> 00:40:31.960] You don't need all the stuff that you see venture-backed folks saying.
[00:40:32.040 --> 00:40:39.720] And that's why, if you listen to any of the big VCs, like there's usually some nuggets of truth, but there's a lot of bias towards becoming a unicorn or a decacorn, right?
[00:40:39.720 --> 00:40:42.200] And a lot of that stuff will actually break a bootstrap company.
[00:40:42.200 --> 00:40:51.160] But even within bootstrapping, I had this realization, well, it was about eight or nine years ago that there are the lifestyle bootstrappers and the ambitious bootstrappers.
[00:40:51.160 --> 00:40:52.120] This is the way I term it, right?
[00:40:52.120 --> 00:41:04.760] And lifestyle bootstrappers are more like the indie hackers, where it's like, I don't want to do stuff I don't want to do, and I'm willing to limit my growth and my aspirations based on my comfort zone, how much I want to work, whatever it is.
[00:41:04.760 --> 00:41:06.440] I want my lifestyle to be number one.
[00:41:06.440 --> 00:41:09.560] I was that until about 2011, 2012.
[00:41:09.560 --> 00:41:13.720] And then when we started Drip, I realized, oh, I have a tiger by the tail.
[00:41:13.720 --> 00:41:15.000] This could be life-changing.
[00:41:15.000 --> 00:41:15.800] I'm going to switch.
[00:41:15.800 --> 00:41:18.520] And I went into a different mode where I then sacrificed some things.
[00:41:18.520 --> 00:41:20.120] I work more than I should have.
[00:41:20.120 --> 00:41:27.240] I sacrificed, you know, some of my quality of life for a few years in order to get rich, is basically what the thing was.
[00:41:27.240 --> 00:41:31.560] And that's, I think, something that folks miss online.
[00:41:31.560 --> 00:41:33.160] And you'll see it on X Twitter.
[00:41:33.160 --> 00:41:36.760] You'll see an indie hacker arguing with someone who wants to build a $10 million SaaS.
[00:41:36.760 --> 00:41:40.120] And really, what you need to do is different in those two cases.
[00:41:40.120 --> 00:41:41.800] And so you can say, well, you should never do this.
[00:41:41.800 --> 00:41:42.360] You should always do this.
[00:41:42.360 --> 00:41:47.400] And it's like, well, not if I'm designing my life and my company a little differently than yours.
[00:41:47.720 --> 00:41:48.840] I wanted to pop in here.
[00:41:49.160 --> 00:41:52.840] Just kind of triggered a little thought in my head in and around sales.
[00:41:52.840 --> 00:41:57.000] I've been doing sales for our new product in TinySeed.
[00:41:57.000 --> 00:41:57.880] We have SaaS Institute.
[00:41:57.880 --> 00:42:03.000] We're working with $1 million-plus ARR founders, and we have the accelerator.
[00:42:03.000 --> 00:42:07.720] We have this process of the way that we grow our companies with the accelerator that we've been doing for the last six years.
[00:42:07.720 --> 00:42:16.640] We have this new product in SaaS Institute, where it's mentorship, advisors, coaching, and facilitated masterminds.
[00:42:16.640 --> 00:42:17.840] And I'm doing the sales for that.
[00:42:17.840 --> 00:42:20.080] And this is brand new to me.
[00:42:14.840 --> 00:42:21.760] As someone, I love to build products.
[00:42:21.920 --> 00:42:23.120] I haven't really done sales before.
[00:42:23.120 --> 00:42:27.680] I've been talking to founders within Tiny Seed for the last six years about how much people hate sales.
[00:42:27.680 --> 00:42:34.480] And the thing that really changed my perspective on this, it sounds kind of silly, but it's that sales is everywhere.
[00:42:34.480 --> 00:42:48.160] Sales is just understanding the perspective of the person you're talking to, understanding what they need, and figuring out on your end how to give it to them or how to match their needs.
[00:42:48.160 --> 00:42:56.560] And so, if you're talking to someone who's not necessarily a buyer, how do you give that person who is not the buyer the tools that they need to make this happen?
[00:42:56.560 --> 00:43:07.920] When I'm talking to someone and trying to convince them to spend a couple thousand dollars a month to join our coaching program, what can I say to reassure them of the value of that?
[00:43:07.920 --> 00:43:10.720] What do they need in order to be assured of that value?
[00:43:10.720 --> 00:43:17.120] And this kind of like has leaked into my thoughts for everything I do when I'm talking to Rob about my job.
[00:43:17.120 --> 00:43:18.240] A lot of that's sales too.
[00:43:18.240 --> 00:43:19.520] Like, what does Rob?
[00:43:19.520 --> 00:43:20.880] He's shaking his head at me.
[00:43:20.880 --> 00:43:22.080] What does Rob need, right?
[00:43:22.080 --> 00:43:24.480] What is, what is, what do Rob and Einar need?
[00:43:24.480 --> 00:43:34.880] And when I'm talking about what I'm doing in my job and when I want something, how do I phrase that in a way that I am reassuring them of the things that they need, but I'm also getting what I want?
[00:43:34.880 --> 00:43:42.880] And so when founders come to me and they say, like, oh, I don't like to do sales, I think that an interesting way of rethinking it is it's not a scary process.
[00:43:42.880 --> 00:43:45.440] It's literally, you kind of have to think one step ahead.
[00:43:45.440 --> 00:43:48.800] How do I, the person I'm talking to, how do I get them what they need?
[00:43:48.800 --> 00:43:50.160] How do I understand what they need?
[00:43:50.160 --> 00:43:51.440] And how do I get it to them?
[00:43:51.440 --> 00:43:54.160] And then sales becomes a lot easier when you think of it that way.
[00:43:54.480 --> 00:43:55.920] Tracy Osborne.
[00:43:55.920 --> 00:43:58.320] People can find you on the internet at TracyOsborne.com.
[00:43:58.320 --> 00:44:01.640] And of course, TracyOsborne.com on Bluesky.
[00:44:01.640 --> 00:44:06.600] You're broadcasting to us live from the gift shop of a 1980s ski lodge.
[00:44:06.600 --> 00:44:09.640] Thank you so much for making the show today.
[00:44:09.640 --> 00:44:11.000] It's true.
[00:44:11.640 --> 00:44:12.920] Glad to be here.
[00:44:12.920 --> 00:44:20.920] Einar Vollset, you, of course, are posting and arguing about the Giants on X/Twitter at EinarVollset.
[00:44:20.920 --> 00:44:24.040] We will link that up in the show notes.
[00:44:24.440 --> 00:44:27.080] Do not judge us by his Twitter feed.
[00:44:27.080 --> 00:44:28.200] Yeah, no, no, no.
[00:44:28.440 --> 00:44:32.920] His opinions do not reflect those of Tracy or TinySeed.
[00:44:33.240 --> 00:44:42.520] Einar is, of course, broadcasting from a discreet garage where he looks like he's about to hold the annual meeting of his Rotary Club.
[00:44:42.520 --> 00:44:45.480] So we will wrap it up here.
[00:44:45.480 --> 00:44:47.080] Einar, thanks so much for joining.
[00:44:47.080 --> 00:44:48.920] Sweet burns at the end.
[00:44:48.920 --> 00:44:49.800] Sweet burns.
[00:44:49.800 --> 00:44:56.920] The other joke was: I'm expecting to see your bowling trophy collection appearing just off camera inside the wood panel behind you.
[00:44:56.920 --> 00:44:59.720] All that said, you two, thanks for joining.
[00:44:59.720 --> 00:45:01.560] Yeah, glad to be here.
[00:45:01.560 --> 00:45:02.840] Thanks for having me.
[00:45:02.840 --> 00:45:05.800] Hope you enjoyed this episode of Hot Take Tuesday.
[00:45:05.800 --> 00:45:10.920] If you like this show format, please give me a shout on X Twitter.
[00:45:10.920 --> 00:45:12.280] I'm at Rob Walling.
[00:45:12.280 --> 00:45:20.040] And if you look at the StartupsPod Twitter account, you'll see a tweet with a video clip from this week's episode that I typically retweet.
[00:45:20.040 --> 00:45:22.920] Just chime in there and say, I really enjoyed this episode.
[00:45:22.920 --> 00:45:24.120] You should do more of these.
[00:45:24.120 --> 00:45:25.720] Because I don't know the last time we did one.
[00:45:25.720 --> 00:45:26.760] It was probably six months ago.
[00:45:26.760 --> 00:45:32.360] And I've gotten a couple requests over time to do it, but I often just kind of forget about doing them.
[00:45:32.360 --> 00:45:35.480] So, if you want to hear more of these, let me know on Twitter.
[00:45:35.480 --> 00:45:37.400] Thanks for tuning in this week and every week.
[00:45:37.400 --> 00:45:41.240] This is Rob Walling signing off from episode 792.