AI Model Stalemate, Cloud Giants Adopt Nuclear
Key Points
- OpenAI’s rumored 4.5‑model release was shelved, likely because Anthropic and Google are holding back their own upgrades, creating a “who jumps first” game‑theory stalemate that may only break when market pressure forces a next‑gen launch.
- According to current rumors, OpenAI is now planning to skip any interim release and wait for a full 5‑generation (or 5.5) model before unveiling anything new.
- Microsoft, Amazon, and Google have each signed multi‑hundred‑million‑dollar deals for small modular nuclear reactors, signalling that the growing energy appetite of AI‑focused data centers will soon be met by onsite nuclear power.
- The push for nuclear is driven by the massive power draw of Nvidia’s new Blackwell GPU architecture, which is already sold out through 2025 and powers the inference and training of the latest large language models.
**Source:** [https://www.youtube.com/watch?v=xsKJNw-jN7E](https://www.youtube.com/watch?v=xsKJNw-jN7E)
**Duration:** 00:08:10

Sections
- [00:00:00](https://www.youtube.com/watch?v=xsKJNw-jN7E&t=0s) **AI Model Release Rumors Stall** - The speaker explains that the anticipated OpenAI 4.5 model never launched because OpenAI, Anthropic, and Google are hesitating to release new models amid competitive pressure, preferring to wait for a more impactful generation.

Full Transcript
We're going to go through five news items from the week that you might have missed, and the first one you definitely missed, because it didn't happen. That's the interesting part: it's news that it did not happen. The 4.5 model from OpenAI did not drop, even though there was a lot of rumor that it would drop this week. The reason it didn't is probably that Anthropic did not drop their model. So what we have right now is a situation where Anthropic, Google, and OpenAI all probably have better models lying on the shelf, metaphorically speaking, but none of them wants to go first, for fear of losing the PR buzz that comes from a model release when it's immediately followed by another company releasing their model and claiming it's great.

So everyone's sitting there going, "What do we do?" Everyone's waiting for the other player to drop their model. I don't know how that's going to get resolved; that's up to OpenAI, Google, and Anthropic, and we will see. I would expect that eventually the market pressure will get high enough that a model drops. The current rumor as of this week is that OpenAI has gone back to just waiting until the fifth-generation (or 5.5) model comes out, and they're not going to release a halfway model. The definition of a halfway release is weird in LLMs anyway, because how do you measure the performance of something that's halfway through a generation you haven't released yet? I don't know. The point is they didn't release it, and they didn't release it for interesting game-theory reasons: everyone is staring at everyone else, waiting for someone else to go first. We'll just have to see how that resolves, and who jumps in the water first with a next-generation model. We shall see.
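The "who jumps first" dynamic can be sketched as a toy two-player game. This is purely illustrative: the payoff numbers are invented to capture the structure the speaker describes (releasing first risks being immediately upstaged), not anything from the labs themselves.

```python
# Illustrative payoff matrix for the "who jumps first" stalemate.
# All numbers are invented for illustration; only the structure matters:
# whoever releases first risks being upstaged days later by a rival.

# Payoffs as (Lab A, Lab B); higher is better.
payoffs = {
    ("release", "release"): (1, 1),  # both launch; the buzz splits
    ("release", "wait"):    (0, 3),  # A launches, B upstages it shortly after
    ("wait",    "release"): (3, 0),  # B launches, A upstages it
    ("wait",    "wait"):    (2, 2),  # stalemate: both sit on better models
}

def best_response(options, rival_move, me):
    """Pick the move that maximizes this lab's payoff given the rival's move."""
    def my_payoff(move):
        profile = (move, rival_move) if me == 0 else (rival_move, move)
        return payoffs[profile][me]
    return max(options, key=my_payoff)

print(best_response(["release", "wait"], "wait", me=0))     # -> wait
print(best_response(["release", "wait"], "release", me=0))  # -> wait
```

With these (made-up) payoffs, "wait" is each lab's best response no matter what the rival does, which is exactly the stalemate described above; a growing market-pressure penalty on mutual waiting is what would eventually flip it.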
Number two did happen: Google, Microsoft, and Amazon are now all in the nuclear power game. We heard about Microsoft earlier; Amazon inked a $500-something-million deal for nuclear this week, and Google inked a deal for a lot of power but would not name a figure, which probably means it was also a lot of money. Bottom line: the big players in cloud can see that their energy demands are about to get so high that they need nuclear to handle them, and they need small reactors close to the data centers to provide continuous power. Which means future data centers are not just going to be data centers; they're going to have a small modular nuclear reactor running the data center right there.

You might wonder: why is that? Why do they need that much more power? It's because the new chips need more power to run. Nvidia's new Blackwell chip architecture is what's shipping right now, and it's booked out for the rest of 2025; they've sold out of Blackwell for the whole year. That's what will run the model inference and the next-generation training runs for the new models that Anthropic, OpenAI, and Google are trying to build. Blackwell comes in at 208 billion transistors per chip, versus the chips used to train current ChatGPT models, which came in at only around 80 billion. So it's a much smarter chip, but it also draws a lot more power. Blackwell chips in a server rack in a data center come in at 60 to 120 kW per rack, and most data centers just don't have the power configuration to handle that; they top out at less than 50 kW per rack. That's going to be hard, and it explains the data center revamp: you essentially have to revamp your data center to handle the newest generation of chips and build the nice models.
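The rack-level arithmetic above is easy to make concrete. A minimal sketch, using the transcript's per-rack figures and an assumed, purely hypothetical rack count:

```python
# Back-of-the-envelope check of the rack power gap described above.
# The per-rack figures come from the transcript; the rack count is a
# made-up example facility, not a real data center.

LEGACY_RACK_LIMIT_KW = 50        # typical existing ceiling per rack
BLACKWELL_RACK_KW = (60, 120)    # transcript's range for Blackwell racks

racks = 1_000                    # hypothetical facility size

legacy_capacity_mw = racks * LEGACY_RACK_LIMIT_KW / 1_000
blackwell_low_mw = racks * BLACKWELL_RACK_KW[0] / 1_000
blackwell_high_mw = racks * BLACKWELL_RACK_KW[1] / 1_000

print(f"Legacy power budget:  {legacy_capacity_mw:.0f} MW")
print(f"Blackwell demand:     {blackwell_low_mw:.0f}-{blackwell_high_mw:.0f} MW")
print(f"Worst-case shortfall: {blackwell_high_mw - legacy_capacity_mw:.0f} MW")
```

Even at the low end of the range, the same facility needs at least a 20% bigger power envelope, which is the revamp the speaker is describing.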
That's also why office construction figures in the US are doing very well. It turns out that, for reasons I don't fully understand, data centers are categorized as a kind of office in most US economic reports. So the reason we've been building new offices is that we've been building new data centers, and we've been building new data centers to handle the power consumption. Which brings us to one of the more surprising pieces of news for the week.
ASML flopped. For context, ASML is the Dutch company that builds the fabrication machines that make the chips that run our world. They build them for the production lines that make these Blackwell chips for Nvidia, but they build them for everybody else too. The stock flopped: they lost something like $75 billion in market capitalization this week and scared everybody, because they came in about $5 billion lower on their 2025 sales guidance than everyone had expected, and their bookings, which are essentially future orders for their machines, came in at half what people expected. Now, they do have a big backlog; they're working through something like a $30-billion backlog of bookings. But the fact that new bookings came in at around $2.5 billion instead of $5 billion scared the markets briefly, and everybody with AI exposure in their portfolio took a bath on the stock market. It's since evened out, and I'll explain why: ASML essentially said
this is not about AI; it's about the fact that certain fabrication forges, the places where chips get made, are delayed. There's one that's really delayed and relevant to the forecast: the Intel forge in Germany, which got pushed back. It got pushed back because Intel has been having real trouble making chips; they're falling behind in the chip race, and that's impacting their ability to make capital investments, which in turn hits ASML. The other reason is that ASML has been selling fabrication units to China, and there's speculation that export controls are coming that will limit their ability to do that going forward. That's part of why they slashed their guidance: they said they expect China revenue to drop to about 20% of total revenue, versus roughly 50% in the past year or so.
Okay, if all of that felt like a snooze to you, the TL;DR is this: the company that makes the machines that make the chips that run our world came in lower than expected on revenue guidance. That matters because everyone has been expecting these chip companies to do really, really well, what with everyone needing all of these chips to run AI, and what ASML basically said is: it's not about AI.

Then, just today, we heard from TSMC. Yes, I know these companies all have acronyms. TSMC is the Taiwanese company that is the most advanced chip manufacturer in the world, which is why the most advanced chip architectures, like the Blackwell chips we just talked about, come from Taiwan. TSMC has said they have so many orders they can barely keep up with demand: a healthy five-year outlook, a 54% rise in quarterly profit, and beaten expectations. The people actually building the AI chips are doing fine; in fact, TSMC is the most valuable company listed in Asia right now. Looked at that way, the story to me reads much more like "AI is doing fine; ASML, hit by delayed fabrication plants, needs to adjust expectations."

There you go, that's the news for the week. Oh, one more: NotebookLM dropped their actual product, and you can now pass a note to the podcasters, which I will be doing. Check out NotebookLM if you haven't. All right, cheers.