OpenAI Dev Day: Builder Era Begins
Key Points
- OpenAI’s recent “Dev Day” rollout wasn’t about new consumer features but a suite of developer tools—including an Apps SDK and a nascent app‑store model—designed to make ChatGPT the core compute platform for third‑party services.
- By rewarding “token‑heavy” users with plaques, OpenAI signaled its strategy to shift computing from bits‑and‑bytes to tokens, positioning itself as the future infrastructure provider for AI‑driven applications.
- This launch marks the “builder stage” of AI, meaning 2024‑2026 is a prime window for developers to create and monetize AI products, while the broader market‑ready apps that will make AI feel “real” for end users are still emerging.
- For AI leaders and adopters, the key takeaway is to evaluate partnerships based on who supplies the underlying token‑compute platform rather than just brand dominance, as the ecosystem’s success will hinge on the robustness of the developer‑first infrastructure.
Sections
- Untitled Section
- ChatGPT App Boom, Not a Lockdown - The speaker contends that although the ChatGPT ecosystem creates massive builder opportunities akin to the iPhone era, fierce competition among multiple models and startups means no single firm has secured market dominance.
- Navigating Multimodel Platform Lock‑In - The speaker debates whether developers should embrace a single‑vendor AI builder like OpenAI’s agent platform or pursue a flexible, multimodel strategy—highlighting Azure’s open‑model stance as a contrasting approach.
- Google Gemini Undercuts OpenAI Pricing - The speaker argues that Google's inexpensive, TPU‑powered Gemini creates a price floor that stops OpenAI from charging premiums, prompting enterprises to cut token usage through cheaper models and prompt‑engineering to lower their AI cloud bills.
- Future Scenarios for AI Model Platforms - The speaker outlines three possible market outcomes—a dominant OpenAI layer above commoditized models, a fragmented ecosystem where developers directly choose among various models, and a hybrid enterprise‑collaboration scenario.
- Betting on AI Futures & Builder Advice - The speaker assigns roughly 45% to industry fragmentation, 25‑30% to AI becoming the next “AWS,” and the remaining 20% to integration scenarios, then encourages developers to aggressively build across all AI platforms to capitalize on the current boom.
Full Transcript
**Source:** [https://www.youtube.com/watch?v=prODjJ9oQyM](https://www.youtube.com/watch?v=prODjJ9oQyM) **Duration:** 00:21:01

Timestamps:
- [00:00:00](https://www.youtube.com/watch?v=prODjJ9oQyM&t=0s) Untitled Section
- [00:03:54](https://www.youtube.com/watch?v=prODjJ9oQyM&t=234s) ChatGPT App Boom, Not a Lockdown
- [00:07:42](https://www.youtube.com/watch?v=prODjJ9oQyM&t=462s) Navigating Multimodel Platform Lock‑In
- [00:11:32](https://www.youtube.com/watch?v=prODjJ9oQyM&t=692s) Google Gemini Undercuts OpenAI Pricing
- [00:14:41](https://www.youtube.com/watch?v=prODjJ9oQyM&t=881s) Future Scenarios for AI Model Platforms
- [00:18:07](https://www.youtube.com/watch?v=prODjJ9oQyM&t=1087s) Betting on AI Futures & Builder Advice
OpenAI, the makers of ChatGPT, launched a ton of new products. And it wasn't ChatGPT. It was everything else. It was
everything for developers to build the
ecosystem of AI. And I want to get into
why we should all care about that
because the public narrative is really
clear. It's, you know, OpenAI is
dominating everything. They're building
everything. And who else can compete
with them? But it's a lot more
complicated than that. And I want to get
into where you have an opportunity if
you're in the AI space building. I want
to get into what you should take away if
you're in leadership and you're looking
at which AI company to go with. And I
want to get into what we the people we
the people who are just interested in AI
and trying to figure out how to use AI
in our careers should take away from
this. The first thing to think about is
that this wasn't aimed at all of us.
This was aimed at developers. And that
gives you a clue as to where AI is
today. AI is at the builder stage today.
I keep emphasizing this in my Substack.
If you are a builder, this is your year.
2026 is also your year. This is a moment
when the applications that will make AI
feel real for everybody don't yet fully
exist. And I know what you're saying.
You're saying, Nate, the irony. We are
talking about dev day and look what
OpenAI launched. This is the application
of the future. I've seen those newspaper
articles because what they launched is
effectively a play on Apple's app store.
So, you know, Apple famously launched a
hardware device that became a platform.
Then they launched the app store and
then the app store became how they
monetize. It took a Supreme Court case
to change that actually, funny enough.
But OpenAI wants the same play. They
want to say: we have, as they announced, 800 million weekly active users. We have
so many people. We are going to launch
an app store within our platform and
everybody will use it. And so that's
what they announced. It's called the Apps SDK, a software development kit. Third-party apps can now integrate directly into ChatGPT. Spotify, Calendly, think QuickBooks, think Zoom. All of these apps now can have a home inside ChatGPT. So what does that mean? That means ChatGPT wants to be the computing layer for the future. That's their basic
thesis, right? The idea is for 70 years
or however long we've had computers, we
have computed in bits and bytes and now
we compute in tokens. So if we compute
in tokens, why not run all those tokens through OpenAI? I don't think it
was a coincidence that they literally
gave out highly visible awards to the
people who spent the most tokens with
them. I saw some of the plaques online,
right? 10 billion tokens, a trillion
tokens. They put the names up on the
stage there with Sam Altman. They want to
tell you two things. They want to tell
you, one, they have all of the chips to
serve those tokens. And two, the future
is about computing with tokens and the
future is with them. But the thing is,
it's not as clear as they made it sound
from the stage. I'm going to give you a
couple of examples, and I want you to
start to wrestle with this. Whatever
your role in the AI ecosystem, the
questions underneath the PR are what
matters the most. So, question number one: is the future as drag-and-drop as they are suggesting? Is
the future as app within an app as they
are suggesting? Those were two of the
major plays they made, right? I talked
about the Apps SDK, this idea that you would operate everything in this little app window inside ChatGPT. Or the future is building with agents. I talked about
that where you actually can drag and
drop and put agents into a linear flow.
Now, don't get me wrong, there's a lot
of functionality there. Just like in the
iPhone moment, there are going to be
builders that turn into millionaires
overnight because they take advantage of
the app store moment for ChatGPT, because they take advantage of being able to build agents with ChatGPT when
everybody else is still trying to figure
it out. So those opportunities are
there, but that is different from saying
that they have achieved a lock on the
market and I don't think they have. And
the reason why is actually pretty simple. Developers like the current world we have, where we have multiple models competing, often viciously, to offer cheaper and cheaper prices for tokens, competing to deliver better and better experiences. Cursor, Claude Code, Codex, competing back and forth. It is great for developers. Developers have never been so catered to, right? And if you're getting into building and you are using vibe coding to build, or vibe engineering as I heard it, great. You also have never been so catered to. We have billion-dollar startups
competing for your attention and for
your dollar. In that world, do you
really want to lock in with OpenAI? That is the question, and that is
why this is not the same as the iPhone
moment. Think back to 2007. Well, I have
the gray hairs. I'm thinking back to
2007. iPhone was the first truly
interactive multimedia phone that had
product market fit. Before that, yes,
there were phones with screens. Yes,
there were some phones with apps. Yes,
there was BlackBerry. But none of them had an intuitive, clean platform that anybody
could plug things into and just build.
And that is what iPhone did with the App
Store. I remember one of the first successful apps was the beer leveler, where you would just tilt your iPhone screen. I swear this sounds like the dumbest thing now, but it was all the rage because we didn't know that you had, like, the accelerometer or whatever that determined the level. And you would just
be able to sit there and play with it
and watch the simulated sort of pixels
dash across the screen as the beer
stayed level no matter how you turn the
phone. This is what passed for
entertainment in 2007. But now we don't have that world anymore. We have a world where Claude has the MCP server ecosystem and has open-sourced it, and everybody loves it, and Google's using it and OpenAI is also using it. Anthropic is excellent at making Excel files, at making PowerPoint decks. They're going after work primitives. They have Claude Code. Google is out
there with really aggressive pricing per
token. And so they are going after
anyone who cares about sort of price
sensitivity for intelligence and they
have an immense hardware stack and very
strong expertise to back it up and
continue to invest. There's other
players too, right? We'll see where Meta
goes. We'll see where Grok goes. It is
not a world where there is just an
iPhone. Imagine a world where there were
five different iPhones. And that matters
because part of what made the App Store
successful was that it was the only game
in town. That's not true for OpenAI. It
is not the only game in town. Now, I
will say anyone at OpenAI is going to
come back with a very reasonable
response. Nobody else has 800 million
users. Well, true, but Google has, I
don't know, 400 and change with Gemini
and growing fast. And Anthropic arguably
has the super premium segment that
iPhone traditionally represented because
people are choosing to lean in and pay
for it, which is part of what Anthropic
was leaning into in their marketing this week: put on your thinking caps. They
had popup cafes in New York and San
Francisco. People waited in line. They
got a thinking cap. And the clear
implication was if you're a thinking
person, you're picking Anthropic, right?
So, I think the book has not yet been
written on where all of this is going to
end up. But developers and builders have
an unprecedented set of choices and are
not super excited about a world where
they lock in. One of the things that n8n can argue (n8n being the agent builder that arguably is most affected by OpenAI's launch of Agent Builder yesterday or the day before) is that it offers pre-built connectors to a wide range of apps and it's model agnostic. You can bring in Claude, you can bring in OpenAI, they don't care. Well, OpenAI isn't model agnostic, is it? Right? Like, you're bringing in OpenAI's models. That's what they want. That's why they built it. And so I think that the question
I have is in a world that is this
multimodel are we really going to get
excited about moving to a platform that
is trying to lock us in as builders? Now let's say you're sitting in the C-suite. The
question for you is increasingly do you
listen to the developer side of the
house and leave room for your technical
teams to experiment across a range of
models and options or do you get into a
lock-in relationship with OpenAI because you trust the brand? And I have literally
sat and had conversations with leaders
wrestling with that. And I think in
light of that it is super interesting to
see Azure's strategy with multimodel. Because Azure has very deliberately, even though they have an investment in OpenAI, right, a big one, decided to lean in on being multimodel. If you want to get Grok in Azure, you can do it. If you want to get Claude, you can do it. If you want to get Gemini, heck, you can do that too. Azure is not going to pick sides, and that's very much Satya's platform play. Right? Like, Satya Nadella, CEO of Microsoft, he's going to sit there and say, as long as you're getting Azure cloud, I don't care, right? You
can get whatever you want. And that
feels much more freeing if you're a
developer because the value is in being
able to flexibly choose. Now, I'm going
to come back and ask you why, right?
Like we have the choice like we could
say that the choice is why, but there's
another reason to this. The pace of
development in this ecosystem is so fast
that it seems irrational to most players
to lock themselves in to a single vendor, and that is what OpenAI is inviting. And I think that may generate some brand
problems for them because I don't think
developers want to be locked in. I don't
think vibe coders want to be locked in
and if asked I doubt consumers do
either. I think ChatGPT is winning partly because it is the Kleenex of AI, right? It is a brand name that has become synonymous with generative AI, and that is part of why that long tail of free users is using ChatGPT and not another tool. So another question I want
to get to, we've talked through the
whole developer builder thing. Fine. The
other question that I want to get to
that is underneath this larger story is
a question around liquidity of tokens.
So, for right now, there are three big
games in town if you're developing or
building. And OpenAI was out there sort
of praising folks that spent a lot of
tokens. But what's interesting is there
were some people whose names were on
that billboard saying, I spent a
trillion tokens or 100 billion tokens or
whatever it is who were saying publicly
on X afterward, I don't want to be on
that billboard. I want to spend fewer tokens. My job is actually to compute
less so I am more efficient so I drive
better margins for my business. I don't
want to be on the board. Well, that's
kind of the difficulty of being in a
model maker position, isn't it? You are
in a position where you're offering
intelligence, but people know it's
metered by the token. They know they're
charged by the token. They don't want to
pay more than they have to pay. And in
that world, again, competition is
helpful because you compete on price.
And this is where I lean really into the
Google Gemini side. Google having TPUs
in their stack helps them to compete
aggressively on price. And when they can
compete aggressively on price, they can
give everyone else in the business a
price floor that they have to be honest
about. And so OpenAI cannot charge a
premium for their models beyond a
certain point because people will just
switch down to Gemini. And I think that in that sense, what we're really talking about here is a world where OpenAI wants you to be incentivized to burn tokens. Model makers in general want that. They celebrate that. That was part of Dev Day, where they talked about their token burn rate. But any given individual in the game wants to spend fewer tokens. Every CTO I know,
every CIO I know who I've talked to,
they don't want to spend more on tokens.
They want to spend less. It's like a
cloud bill. You want to reduce it. If
you want to reduce that bill, you're
going to go both with the cheapest token
per intelligence available and you're
also going to see if you can compress
your calls and do all the other prompt
engineering stuff to make it effective,
to make it cheap, to make it efficient.
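The arithmetic behind that "treat it like a cloud bill" logic can be sketched in a few lines. Everything here is a hypothetical placeholder for illustration: the call volume, the per-million-token prices, and the 40% prompt-compression figure are made up, not real vendor rates.

```python
# Hypothetical illustration of the two cost levers the speaker describes:
# (1) switching to a cheaper model, and (2) compressing prompts.
# All prices and token counts below are made-up placeholders, not real rates.

def monthly_token_cost(calls, tokens_per_call, price_per_million):
    """Estimated monthly bill: total tokens times the per-million-token price."""
    total_tokens = calls * tokens_per_call
    return total_tokens / 1_000_000 * price_per_million

CALLS = 2_000_000        # hypothetical monthly API calls
BASELINE_TOKENS = 1_500  # tokens per call before any prompt compression

# Made-up price points per million tokens for a "premium" vs "budget" model.
premium = monthly_token_cost(CALLS, BASELINE_TOKENS, price_per_million=10.0)
budget = monthly_token_cost(CALLS, BASELINE_TOKENS, price_per_million=2.0)

# Prompt engineering that trims 40% of tokens per call compounds the savings.
compressed = monthly_token_cost(CALLS, int(BASELINE_TOKENS * 0.6),
                                price_per_million=2.0)

print(f"premium model:        ${premium:,.0f}")    # $30,000
print(f"budget model:         ${budget:,.0f}")     # $6,000
print(f"budget + compression: ${compressed:,.0f}") # $3,600
```

The point of the sketch is that the two levers multiply: a 5x cheaper model plus a 40% smaller prompt cuts the hypothetical bill by more than 8x, which is why CTOs chase both at once.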
And that also undercuts OpenAI's play
here, doesn't it? Because they want to
sort of send out the message that you're
just going to be spending more and more
tokens with them because the future of
intelligence is compute. But at the
same time, like it's not, right? At the
same time, people are desperately and
publicly trying to cut how much they're
spending. Now, am I here to say that
like I think that undercuts the demand
story for AI? No. Because there's so
much business demand growing. Like, I
don't think that's what's going on. I
think it's more about being efficient
with your resources and recognizing that
a company that's publicly celebrating
token burn is a company that may not
always have incentives aligned with
yours. So, there are three scenarios for how this plays out, right? I've talked about some of the hidden stories here off of Dev Day. I've talked about some of what they released. I want to give you three ways this plays out. Scenario one: OpenAI wins and becomes the AWS of AI, right? So developers are effectively going to accept managed orchestration over API access, and they will just work with OpenAI. And this may be
because they have a successful CTO play
and they just get all these enterprise
deals and that's how it works. Or it may
be that they just release enough
features and they are price competitive
enough that they can keep developer
loyalty. We'll see. In that world, stuff
like the Apps SDK just becomes standard
infrastructure. Developers find it easy
to build with it. They build a lot with
it. This in turn reinforces the habit
loop with consumers. Consumers start to
spend more time there. The attention
becomes more valuable. The spend becomes
more valuable from consumers and it
becomes this sort of virtuous feedback
loop. This is what OpenAI wants, right?
That's the future that they're hoping
for. Open AI can then capture platform
margins and not just the margins that
you get from inference. Because one of
the sort of underlying things here is
that if you're having to live in a world
where models are cheaper and cheaper,
you want to not just be living and dying
on the cost of inference compute. You
want to monetize the platform with
unfair economic advantage. And that's
exactly the long-term play they're going
for. And I'm not sure that they have the competitive advantage to actually do it.
I think it's a bet and I don't know if
it will work. So models are going to commoditize, but OpenAI is basically going to stay a layer above the commoditization of models in that world. Right? Like, it doesn't matter ultimately if you are talking to multiple models; your compute layer, your commodity layer, where the developers are, where the building is, would be OpenAI. So that's scenario one. I don't know that that is particularly likely, given the things I've talked about. Scenario two: fragmentation wins. Developers will resist platform intermediation, so developers will want to go straight to models and compute against them. They'll
want their pick of models. Vibe coders
and builders will want their pick of
options. AI enthusiasts will want to be
able to pick Claude as well as OpenAI.
And in that world, there will not be as
much developer activity. From a consumer
perspective, there will not be as many
apps in ChatGPT, and OpenAI will remain
perhaps the largest player in the game,
but they may not achieve what we would
call platform economics where they can
charge disproportionate
cash for their competitive advantage
because it's not that dominant. And so
the market looks sort of like the
database market in that world. There are many winners. There's not one dominant platform that owns everybody. Scenario three is a little
bit more creative. Scenario three is a
world where there's some kind of
enterprise team up aside from OpenAI
that leaves OpenAI with the consumer
market but leaves them out of the very
lucrative enterprise and developer
market. That could play out a few different ways. It could look like Anthropic and Amazon teaming up. It could look like Google and Amazon teaming up. There are some plays where Apple teams up with Anthropic. If you notice, everybody's trying to team up with Anthropic; that is a theme. So
we'll have to see. I think this is the
least likely scenario. Like if
fragmentation is the most likely, I
think this one requires some very
complex merger and acquisition and sort
of corporate alliances that would have
to be delicately negotiated. It would be
a world where like you have anthropic on
somebody's cloud along perhaps with
Gemini and enterprises get to choose the
best of breed. I think this is the world
that Microsoft wants to create.
Essentially, an integration world is a
world where the platform economics
remain with cloud providers and Azure
wants to be in place to pick up those
dollars and so does Google Cloud. And
so, one of the things that sort of comes
to mind for me as I think this through
is that, you know, Jassy at Amazon, the CEO of Amazon, was asked why Google Cloud
and why Azure are growing faster than
AWS right now. And he let slip that he
thinks it's because of AI. And of
course, Amazon kind of took a bath in
the markets as a result, but it's kind
of true. Google Cloud and Azure are
basically trying to make this integration play, where they can pull all the models together. Enterprises can
choose best of breed. No one model maker
gets to own the relationship directly
with the enterprise. The cloud provider
owns the relationship with the
enterprise. So, we will have to see. I
think that one's somewhat less likely
because I think the power of these model
makers continues to grow, shift, evolve
as the models become more and more
capable and model makers are becoming
savvy enough that they're offering some
things directly to enterprises through
direct deals. And so it's a complicated relationship, and I think the number of stars that have to align for an integration play is a little high.
So if I had to be a sort of a betting
man, right, and I had to handicap this,
I would say fragmentation winning, maybe
not a coin flip, but close to a coin
flip, call it 45%. OpenAI becoming sort of the AWS, and sort of their bet today or yesterday or this week winning, I handicap
it at 25 to 30%, like I think maybe a
third of a chance roughly, maybe
slightly less. And then the remainder
would go to sort of that integration
play, right? Like call it maybe 20% and
change. And those are the three scenarios, right? I think that's what we're looking at. Let's do takeaways to close this out, right? If you are a builder, your
takeaway
is to build aggressively into all of the
spaces you can. If you can build with
OpenAI's new app store and it takes off,
you're in a position to really do well.
If you can build with agents, you're in
a position to do really well. If you can
build with Claude Code and that's
cheaper and that's more effective and
you can get more done, you're in a
position to do well. It's a builder's
paradise right now. I will then extend
that to anyone who is in sort of the AI
enthusiast category who's listening to
this. If you're passionate about AI,
which I think most of you probably are
if you're listening to this, especially
if you're listening to the end of this
video, which is a long way down, your
chance is now to differentiate yourself
from the competition. And by the way,
your competition is not some hot shot
22-year-old developer in Palo Alto. Your
competition are the non-technical folks.
And so you may feel like you don't have
the technical skills, but your ability
to persist through and, say, build an agent with Agent Builder in OpenAI, or maybe build it with n8n if you don't want to get locked in to OpenAI's standard. That is already worlds better
than most folks who are dealing with AI
right now. I just did a video on AI
fluency. Being able to do that is a
whole lot better than where most folks
are. So this is also your chance to be a
builder. Even if you think you're not as
good as the coders, don't worry about
it. You are doing great exactly where you are. If you are in the C-suite, you
should be very carefully thinking about
the pros and cons of the model
investments you make. You should be, I think, prudently planning for a
multimodel world. You should not assume
despite all of the noise of Dev Day, that Dev Day means you should bet on
OpenAI. I think OpenAI is absolutely a
player. I would argue it's probably the
biggest player at the table. I send
people to look at the new responses API
all the time. I think it's great, but I
don't think it's the only game in town.
And I think that any prudent CTO, any
prudent CEO should be leaving options
open given the pace of this AI race. So
there you go. Most people are talking
about Dev Day like it's a tour de force victory for OpenAI. I think there's more to the story, and I hope you got a
little bit more of it. If you want to
dig in further, I have a whole lot more
in the article. I have a prompt for like
reading through it and thinking through
it in your way and kind of what you
need. Go have fun.