
AI Startup vs Enterprise: Core Learnings

Key Points

  • Startups and large enterprises operate under fundamentally different constraints, so the “right” AI strategy for each varies dramatically.
  • Agile “vibe‑coding” and rapid, even risky, feature releases are viable for startups because they can personally manage a small user base, whereas enterprises must prioritize compliance, data security, and stability to avoid lawsuits and contract losses.
  • High AI‑credit spending in a startup is comparable to hiring multiple full‑time developers, enabling fast experimentation toward product‑market fit, while enterprises face lengthy approval processes (e.g., for tools like GitHub Copilot) and strict governance.
  • The key to success is playing the game your customers expect: startup founders may need to adopt enterprise‑level rigor when targeting large B2B clients, and enterprises should recognize the value of faster, more experimental approaches where appropriate.
  • Understanding these divergent “rules of the sport” helps both sides set realistic expectations and informs where AI adoption will likely head throughout 2025.

**Source:** [https://www.youtube.com/watch?v=ax8Oh5FCLh8](https://www.youtube.com/watch?v=ax8Oh5FCLh8)
**Duration:** 00:25:28

Sections

  • [00:00:00](https://www.youtube.com/watch?v=ax8Oh5FCLh8&t=0s) **AI Startups vs Enterprises: Core Learnings** - The speaker contrasts the distinct speeds, tool stacks, and constraints of AI-native startups and large AI enterprises, outlines six key insights drawn from both, and seeks to bridge the misconceptions separating founders and CEOs.
  • [00:03:05](https://www.youtube.com/watch?v=ax8Oh5FCLh8&t=185s) **Startup DNA vs Enterprise Demands** - The speaker explains how founders must pivot their product focus and speed based on target markets, adopting compliance-heavy practices for B2B clients while staying scrappy for AI-savvy startups and consumers, illustrated by a Google ad subtly mocking Apple's unfulfilled AI promises.
  • [00:06:10](https://www.youtube.com/watch?v=ax8Oh5FCLh8&t=370s) **AI-Generated Code in Enterprises** - The speaker explains that while AI can quickly create small, well-defined software components, large companies often lack the time to experiment with tools like lovable.dev, leading to a speed gap with startups and raising concerns about accumulating engineering debt and reducing code understandability at scale.
  • [00:09:16](https://www.youtube.com/watch?v=ax8Oh5FCLh8&t=556s) **AI's Impact on Technical Debt** - The speaker explains that early-stage startups deprioritize technical debt until scale or compliance demands it, but rapid advances in AI-driven code refactoring will soon make fixing large codebases cheaper, though this won't replace the broader, strategic work of senior engineers.
  • [00:12:35](https://www.youtube.com/watch?v=ax8Oh5FCLh8&t=755s) **Driving Enterprise AI Through Pain** - The speaker argues that large companies lack the immediate pain that drives AI adoption in startups, so leaders must highlight the looming existential risk of ignoring AI to create the urgency needed for meaningful implementation.
  • [00:16:06](https://www.youtube.com/watch?v=ax8Oh5FCLh8&t=966s) **AI Workflow Leverage and Resistance** - The passage explains that AI-driven workflows have exponentially higher impact and require careful coordination at large enterprises versus startups, while previous disappointing AI experiences foster resistance, giving AI-native newcomers a distinct career advantage.
  • [00:19:30](https://www.youtube.com/watch?v=ax8Oh5FCLh8&t=1170s) **AI Accelerates Startup Disruption** - The speaker outlines six AI adoption principles and warns that AI-native startups achieve dramatically higher development velocity than traditional enterprises, creating a growing disruption risk for slower organizations.
  • [00:22:51](https://www.youtube.com/watch?v=ax8Oh5FCLh8&t=1371s) **Bifurcated Future of AI Coding** - The speaker argues that AI-assisted code will dominate large enterprises while small startups will still hand-craft most software, cautions against taking AI leaders' timelines at face value, and urges companies to build realistic, customer-focused solutions in this rapidly shifting, high-stakes landscape.

Full Transcript
[0:00] You know, AI-native startups and AI enterprises have been moving at different speeds, getting different things done, using different tool stacks. And for the most part, when I look at the discourse online, when I talk to founders in the small-startup category, and when I talk to CEOs of large companies, what I hear is two entirely different worlds, and frankly a lot of disappointment in what's framed as the other side. I would like to instead take the time to look at six core learnings that have emerged from studying both startups and enterprises as they grapple with the transition to AI at the same time, but under very different conditions. So this should apply to you if you are working in a large-company environment, but equally if you are working in a small company, or even if you're just a solo founder. If you've ever wondered what the difference is between these two environments besides size, and how that is impacting AI, beyond the really obvious point that big companies often go slower and get mocked for it (and they shouldn't be), well, I'm here to disentangle all of that, give you the learnings, and then offer some reflections on where this is going in the rest of 2025.

[1:16] So, number one, let's get into those core learnings. Different constraints will create different correct answers. Startups and enterprises just aren't playing the same game. They're playing different sports with different rules, effectively. When a startup founder ships a broken feature to 10 customers, well, you can personally call everyone if that breaks. And so vibe coding is very viable. When a PM ships to 10,000 healthcare-company customers, one data leak could trigger lawsuits, lose the biggest contracts that the business has, and eventually destroy the company. The standards are very different.
The solo founder can rebuild the entire codebase over a weekend. The PM must maintain systems with database columns that stem back to migrations in, oh, I don't know, 2008. The startup founder who burns thousands or even tens of thousands of dollars a month in AI credits isn't actually being reckless. Even if a lot of those credits get used and reused across the same part of the codebase, effectively he or she is buying the equivalent of three or four developers working 24/7 trying to solve the really difficult problem of product-market fit. The enterprise, on the other hand, is going to take several months to approve GitHub Copilot, but they have SOX compliance. They have enterprise customers doing security audits. They have a board that demands very predictable quarterly results. They don't have the same game or the same rules. And my reflection for you on this is that you play the game that your customer wants you to play.

[3:04] And this is actually more profound than you would think, because many times startup founders end up having to act more big-company because they are in the B2B space and they're trying to serve larger and larger customers. And so they end up getting themselves into this world where they have to do compliance audits and enterprise processes and this and that. And that doesn't mean they lose their fast, young, scrappy DNA, but they do have to start to shade toward the enterprise space faster. On the other hand, if you are serving hungry startups and that is your primary customer base, well, yeah, you want to be as hungry as you can. If you are serving AI-savvy builders and AI-savvy consumers, you want to be as quick as you can, and you want to actually make sure that you are iterating faster than anybody else.
And finally, if you're serving consumers as a whole, which is a different demographic, you want to be in a position where you can make the change translatable and manageable for them. I saw a great ad from Google that really illustrates this. The ad, without generating any liability for Google, took a poke at Apple: it basically said, without using the word Apple, that another big phone company had made a lot of promises about AI and hadn't kept them (everyone knows Apple made a big deal about rolling out AI and couldn't deliver), and then suggested that maybe the step forward is to work with a phone that can keep its promises. Right? You need to make these big AI changes feel natural and easy for consumers, and that leads to an interesting hybrid. So the first sort of demographic dynamic is: if you're serving business customers, you're going to be pushed into a more business, quarterly-cadence kind of workflow just naturally, and you have to fight that to go faster. On the other hand, if you're serving super AI-conscious consumers or builders or solo founders or tiny startups, you're going to be pushed to go faster yourself. And then in the third category, if you're serving consumers broadly, you are going to be pushed to make sure that you can translate the change and make it easy to understand. And that tends to lead you to a slightly more unpredictable cadence. You basically have to wait until you have the thing that you know the consumer is going to want, and then double down there. Those are three different worlds. Essentially, what I'm suggesting is that instead of assuming your given size dictates the game and just letting it be, ask: if your customer would like you to work at an ideal pace, what is that pace?
And maybe see if you can shift the constraint set in that direction.

[5:43] Number two, AI changes what building software is. We talk about it a lot, but I want to simplify it a little bit. AI changes building software into a conversation. I think Dan Shipper at Every is a good example. He can have a conversation with Claude Code that eventually becomes a feature, and not just him: other people at his company who are non-engineers can have that too. If you can get to the point where you can describe what you want, then increasingly, if it is a fairly buildable, small-scale piece of software, the AI can do it for you. Maybe it's not Claude Code, maybe it's Codex, but it can still do it for you. Maybe it's lovable.dev, right? And vibe coding sounds too good to be true until you see it work. I have had moments, and I think this is actually really important to do, so listen up if you work in a bigger company: I have had moments where I've talked with directors and above at larger companies, and they just don't have time. They all have meetings. They don't have time for practicing vibe coding, for seeing how Lovable works, etc. Pull up lovable.dev and show them how easy it is to build something. I have seen jaws drop, in a sense. Part of the gap in AI is just knowing what you can do and having the time to try it. And so part of why there's a speed gap that people assume exists between startups and enterprises is that the startups have nothing to lose by dramatically shifting the way they build software. And the enterprise has a lot to lose. The enterprise has engineering debt, and it's non-trivial. I don't want to set this up as an obvious choice to go with a startup.
As an example, if you are shipping AI code changes, shipping AI-written lines over time, you run the risk of your codebase, at enterprise scale, becoming less understandable to you. That in turn generates a tremendous amount of cognitive debt, particularly for senior engineers. And so it's not as simple as saying, well, you have to get with the program and just ship more AI code and go faster. You actually have to think about how organizational size dictates different approaches to AI. I would argue that in this case AI is a conversational process regardless, but it's easier to start the conversation at startups, and at larger companies it's more difficult to start that conversation. You may have elements that feel conversational that only happen after you've done all of the requirements and compliance pieces. And that's what I see anecdotally at larger companies: they do all of those initial steps first, and then the creation of the software ends up being more conversational wherever it can be, right? If they're using Claude, if they're using Cursor, or other tools.

[8:32] Principle number three: technical debt is increasingly optional. So I think there are a number of dynamics here. Another way I've talked about it is that the cost of technical debt is going negative. Code quality at AI-native startups ranges widely. If you have a very strong engineering founder on the team, it's often higher, but it doesn't have to be all that high to ship these days. There are startups where no one knows how to code and they're doing over a million dollars in ARR. And I want to suggest that what that implies is that you could always buy your way out of technical debt with time and with good engineers. And increasingly, time is not the factor. Scale is not the factor.
You just need to have enough scale to be able to afford a production engineer to refactor something if you need it. But that is often so far down the road that you kind of don't care for a while. If you can hit a million dollars off of vibe coding (and the first startup to hit a million dollars off of Lovable literally already exists), you don't really worry about technical debt. You don't worry about whether you fully understand the codebase. You just worry about whether you're shipping for the customer. Now, if you have to pass compliance as an enterprise company, you do worry about the codebase. It's not optional to care. Technical debt can become a legal liability. And so I think this is one of those areas where we are living in a movable constraint set, and we need to watch the next six or eight months. Code is one of the spaces in AI where tech has invested so heavily in leveling up the ability of AI to agentically fix code that we are going to get massive breakthroughs in the next few months. That suggests to me that the problem of refactoring, even in large codebases, is going to get progressively easier. I don't think that's the same thing as saying that AI will be able to do senior engineering work, because senior engineering work is a lot more than refactoring codebases. But it is something we should keep in mind, because what it suggests is that the cost of technical debt has fallen. Yes, it is probably negative for startups, but it is also falling for larger companies very, very quickly. This is the source of some of those big headlines you see where Amazon or others say publicly, "We saved so many thousand man-years by using AI to transition our code from this language to that language."
And they're doing it already, right? They're going to do it even more as the ability to handle context windows gets better, as these AI agents become more proactive and able to work around problems, and so on. This is a lot of the reason why GPT-5 is agentic: it's designed to solve problems like this. So what's interesting is that technical debt is becoming optional. It's becoming something you don't have to care about at the startup level. But the thing I take away from that is that, in a lot of ways, startups were always advised not to care about tech debt in the first place. And so what's changed is that they're still told not to care about it, but now they don't really have to pay to clean up the mess either. They don't have this huge bill that comes due at Series A or Series B to clean up all of their architecture in the same way, because it's so much cheaper and faster to address it. Enterprises still have the compliance burden, but the cost of addressing it is rapidly improving as well. And so this is one to watch. This is one where I think we'd have a different conversation in six months.

[12:11] Number four, success starts with pain. Every AI adoption story you see starts with a team or a founder drowning in work who embraces AI as a band-aid. That's a very consistent pattern. It extends to the consumer, too. People don't embrace stuff unless it immediately solves a real problem. And so, when you think about, or want to criticize, how enterprises are adopting AI, one of the things you need to think about is how real the pain is at the level of the team inside the enterprise. Does the team feel the pain of not adopting AI? Because startups feel that pain right on the revenue line.
If they're not moving fast enough, if they're not shipping fast enough, somebody doesn't get paid that month. Whereas for larger companies, where's the consequence of half-heartedly adopting ChatGPT and only using it to write your emails? It's not a lot in most enterprises. So, success has to start with pain. And one of the things I think we've done a poor job of in midsize and larger companies is really making it clear that the pain may not be acute, but it's real, and it is going to hit the company in an existential way in the next few years if they don't address it. I do not believe in a world where an enterprise can whistle past the graveyard, skip AI, and get away with it. Almost every company is going to have to confront this, and the longer they wait to confront it directly, the worse off they will be. And so this is a case where I think startups should be kinder to enterprises and the mid-market, because they should recognize that getting an entire company to feel pain is a very difficult art. It takes real leadership. Steve Jobs had it. Not everybody has that leadership. If you are a leader in a midsize or larger company, part of your job is to get your team to feel pain until they start to adopt AI.

[14:11] Number five, the workflow matters more than the tools. I have sat in conference rooms and talked to larger companies, and they just sort of make long faces and they're sad. They say, "Well, we can't get ChatGPT. We can't get Claude Code. All we can do is use Copilot." The workflow matters more than the tools. The workflow matters more than the tools. I have written an entire guide for how to use Copilot. It exists. It's out there. You can get it. It's right on Substack.
But it's not about my guide. It's about the idea that you can integrate any AI tool into a good workflow, and if you actually use it, it will beat the smartest tool out there that people don't integrate into their workflows. Part of how startups have an advantage here is that one brain can hold the whole workflow. One brain. One brain can hold the whole workflow, and in a larger company that's not true. You have to have a lot of people working together to shift a workflow, and coordination problems take time to solve. This is actually one of the reasons why I think the excited guesses that AI would quickly take a bunch of jobs in the enterprise haven't panned out. If you're trying to design workflows across multiple teams, the sheer fragmentation of knowledge is really, really hard to overcome. It's not something AI does a good job of. It's something humans do a good job of. Humans need to come together, do AI sprints, do something that helps a larger team figure out how to actually build an AI-first workflow that doesn't just stop at the level of the individual or the small team. And one of the things I've noticed about this is that startups assume that, again, their speed is their advantage, which I suppose is true, and that the other guys are bad at what they're doing, which is not true. Workflow has a hundred to a thousand times more leverage at a larger company. I could run workflows when I was at Amazon that would shift the way thousands and hundreds of thousands of titles worked. Well, you can't do that at a startup, because you don't have the product for it. Workflow matters more than tools, and workflows are higher-stakes and higher-leverage at bigger companies.
It is more important to get them right, and it takes more people to get them right. And so you have to work together to build AI workflows at big companies in a way you don't at startups.

[16:40] Number six, experience tends to create resistance. This is counterintuitive. We like to think that if only they could see AI, everything would be good. My observation is that that is not always true. People will try AI. They'll have one or two disappointing experiences. If they have a prior suspicion of AI, they will just use that suspicion and those one or two experiences as a reason to say, "No, I'm done." Especially if they're worried about their role, or worried about having to change what they know. I know developers who are walking out of tech rather than deal with the AI change. They're like, "I want to go do something else," right? They can take up a hobby, something else. This is an area where, if you are just getting started, you can have an advantage in your career by being AI-native from the beginning, because there are enough senior people who are not ready for this kind of change and are voting with their feet, saying, "I'm done. I don't have to work on this right now. I'm leaving." Well, as a junior person, you can step in. You can be the one who shows that experience doesn't have to create resistance. Again, the startup's advantage here comes from having fewer people. You can handpick your people. Your people can all be people for whom experience creates optimism and forward motion. People for whom experience is a good thing. People who are AI-native. At a larger company, you have to work with the people you have, many of whom have domain knowledge that is highly specialized, whom you can't move out.
And so the problem then becomes: how do we make sure that the tendency toward resistance is reduced and minimized as much as possible across this large, multi-thousand-person company? That is a different kind of hiring reality. You're looking at incentives. You're looking at team leadership. You're looking at how you can make training matter to different teams with widely different needs. You're looking at honest conversations about what career growth means at a larger company with AI. And I'm going to remind you again: the senior developers and the principal developers at large companies are, on the whole, higher-quality engineers than most of the engineers at the smaller startups I've worked with. I've worked with both. There's a reason I go to startups; I've loved startups because of how fast they move. But the quality engineers you see who are lifers at these larger companies are absolutely extraordinary, and it is very rare to see them in the startup world unless they are founders. And so as much as startups would like to say, "Well, we can just pick good people and we're just amazing," you know what? I doubt that most of your engineers operate at the level of the senior principal engineers at Meta or Apple or Amazon or Microsoft. And so, in a sense, you have to pay the piper. If people that good want to take their time understanding AI, you have to make sure you bring them along. And that's another difference. So, those are the six principles we've gone through, right? Experience creates resistance. Workflow matters more than tools. Success starts with pain. Technical debt is becoming more and more optional (that one's really fluid).
AI changes what building software is all about: it makes it a conversation. And different constraints create different correct answers.

[19:59] Let's step back and look at the future. What is actually happening here? Number one, the velocity gap is real and growing. That's something I emphasize a lot when I talk to leaders at midsize and larger businesses, because disruption risk is in some ways a function of velocity, and AI enables such a velocity differential for AI-native startups that traditional enterprises should be more worried. AI-native startups can be orders of magnitude faster than the regular startups of the 2010s. If you can ship a feature in an hour that would have taken a team of engineers two weeks in 2018, well, you are definitely going faster than startups went a decade ago, and that poses more disruption risk to enterprises over the long term. These are fundamentally different realities, fundamentally different physics, for startups. But velocity without direction is just speed. And so the risk for startups shipping 20 versions a week is that they don't have the discipline to learn and are just accumulating chaos. And that is the one advantage enterprises bring: they put so much intention into a release. If they do one solid version in six months, but it actually builds on itself and builds a flywheel of value for customers, they might still win. So the question really isn't who is right. The question is what happens in a world where startups are 10x faster than they were and AI is desperately trying to break through in the enterprise. Will startup velocity actually disrupt enterprise reliability? Or will enterprise reliability and enterprise customer distribution win?
This is the question we're facing, and AI adoption is actually driving it in both of these environments. It really matters. Dario Amodei predicted that AI would write 90% of code in 2025. Technically, that is true at some startups, and it is becoming more true at enterprises, and those mean two different things. At startups, I buy it. If AI writes 90 to 95% of your code at a startup, you are going fast in the way we've talked about. If AI writes 90%, or 60%, or 70% of your code at a large company (I think the CEO of Coinbase said that something like 60% of the code at his company was written by AI), I immediately ask questions, because it's a different kind of problem. Are you incentivizing your engineers by lines of code at that scale? You don't have that problem at a startup; you just want to ship the feature. But at scale, where people are incentivized by goals and metrics, if you say the metric is lines of code, will people just be incentivized to write big, bloated features to hit the lines-of-code goal? Anecdotally, that is absolutely happening. And so my first question at large scale is: should you even be thinking in percentage terms? And second: is Dario simplifying the world too much? Maybe what we're headed for is a bifurcated world, where you have AI-assisted code composition and structuring at larger companies (I don't care or know what the percentage is; it's a meaningful percentage of code that you could say is written by AI, but humans have a huge role to play in how they structure it, because of the complexity of the system), and nearly 100% of code written by AI at small startups. That feels like a more realistic world.
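As an editorial aside, the lines-of-code concern above can be made concrete with a small arithmetic sketch. The function name and all the numbers here are invented for illustration: the point is only that the same shipped feature can report very different "percent written by AI" figures depending on how bloated the AI-generated portion is.

```python
def ai_share(ai_lines: int, human_lines: int) -> float:
    """Percentage of lines in a change attributed to AI."""
    return 100 * ai_lines / (ai_lines + human_lines)

# A tight, human-reviewed implementation of a feature.
lean = ai_share(ai_lines=120, human_lines=80)     # 60.0

# The same feature, but the AI emitted 5x the code (boilerplate,
# redundant helpers) in pursuit of a lines-of-code goal.
bloated = ai_share(ai_lines=600, human_lines=80)  # ~88.2

print(f"lean: {lean:.1f}% AI-written, bloated: {bloated:.1f}% AI-written")
# Same shipped functionality, but the headline "percent written by AI"
# jumps sharply: the metric rewards volume, not value.
```

This is why a percentage-based target is easy to game at scale: inflating the numerator with bloated AI output raises the headline figure without delivering anything more to the customer.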
I will also call out that this continues a trend where the large model makers are making predictions that are directionally true, but when we come up to the actual deadline and look at it, we say, hmm, it's a little early. One of the things I'm learning to do with Sam and Dario and the other major model-maker CEOs is to discount their predictions by a year or two. I take whatever they say and give it another year or two beyond that, because they live in a world where they have better models than are out in public. They are immersed in AI all the time. They may not have the perspective to see the wider world.

[23:57] So where does this leave us? We need to be building companies that reflect the constraints of the game we're actually in, companies that focus on what customers need, in a world where, frankly, customer appetites and customer demand are shifting really rapidly. And we need to recognize that the stakes are existential. If we as a startup, we as a midsize company, we as an enterprise don't figure out how to meaningfully build AI into our workflows, the disruption risk is real and fast. It's like in Jurassic Park, where the raptors kept getting smarter. Other businesses are like those raptors. They keep getting smarter and more AI-enabled, and they're going to catch you unless you're able to actually put AI into place. And so my suggestion to you is: if you are a large company, look at those small-company learnings. Look at the small-company tool stack. See what you can learn. See what you can learn about speed. If you are a startup, if you're in the small-company category, see what you can learn from the reliability, the stability, and the system architecture of the big company.
At least acquire some sympathy for where they're at, because that may be you, particularly if you serve businesses. Both parties here have a lot to learn from one another, and I sometimes feel like I'm the only one talking to both, and they mostly want to throw rocks at each other. So everybody's in this together. We're all learning AI together. Let's see what we can