# AI Boom: Mary Meeker's Report

**Source:** [https://www.youtube.com/watch?v=SykH1k65Dy4](https://www.youtube.com/watch?v=SykH1k65Dy4)
**Duration:** 00:13:00

## Key Points

- Mary Meeker, the famed internet trends analyst, released her first AI report in five years: a 340-slide deep dive that the speaker highlights as a must-read (full summary available on their Substack).
- The report shows AI adoption soaring "up and to the right," with ChatGPT user growth rising 8x in 17 months, reaching 800 million users and generating roughly $4 billion in revenue from 20 million subscribers.
- ChatGPT reached 365 billion annual searches in just two years, 5.5 times faster than Google's eleven-year trajectory, demonstrating unprecedented speed of market penetration.
- Infrastructure spending is exploding: Nvidia's installed GPU compute capacity grew 100x over six years, cloud-provider capex has surged, and data-center buildout has risen 49% annually since 2023.
- Efficiency gains are dramatic, with the energy required per LLM token dropping by a factor of 105,000 over the past decade, making today's AI capabilities feasible.

## Sections

- [00:00:00](https://www.youtube.com/watch?v=SykH1k65Dy4&t=0s) **Mary Meeker's AI Report Highlights** - The speaker summarizes Mary Meeker's first AI report in five years, emphasizing unprecedented user and revenue growth metrics for ChatGPT and the broader AI market.
- [00:03:06](https://www.youtube.com/watch?v=SykH1k65Dy4&t=186s) **AI Efficiency Revolution Cuts Costs** - The speaker explains that the surge in AI usage has coincided with an unprecedented drop in energy and inference costs: over a 105,000-fold reduction in energy per token, and a 99.7% cut in the cost of serving models in just two years, making today's rapid advances feasible.
- [00:07:04](https://www.youtube.com/watch?v=SykH1k65Dy4&t=424s) **AI Model Funding vs Revenue Gap** - The speaker outlines how AI model companies have raised nearly a hundred billion dollars while generating far less revenue, creating a capital overhang that raises serious sustainability and profitability questions for firms like Anthropic, OpenAI, and Google.
- [00:10:28](https://www.youtube.com/watch?v=SykH1k65Dy4&t=628s) **AI Gold Rush Monetization** - The speaker likens the booming NVIDIA AI ecosystem to a gold rush, emphasizing high-margin chip sales over low-margin token sales, and directs viewers to a Substack for a deeper dive.

## Full Transcript
So, Mary Meeker wrecked my weekend, and I mean that in the best sense. She's known as the queen of the internet, and she was famous for her annual internet trends reports, which were particularly insightful from the 1990s to 2019. She was early on Google, she was early on Amazon. She has a sterling reputation for her analysis in the space. This is the first report she has dropped in five years, and it's on AI: 340 slides. I'm guessing you don't want me to go slide by slide, so I want to call out some of the highlights. There is a completely free article over on my Substack if you'd like a full breakdown. It was a lot of fun for me to go through this weekend; honestly, it's incredible to see the amount of due diligence in this deck.

All right, let's get right to it. First up, we have what I call the up-and-to-the-right section, where Mary is basically laying out the case that AI's rate of growth is absolutely unprecedented across all of the traditional metrics in software. She calls out AI user growth, with ChatGPT as an indicator: up 8x in 17 months, to 800 million users, which is just wild. Then she talks about how that translates into revenue. Given the pace of change, these figures are already somewhat out of date: revenue is up toward $4 billion now for ChatGPT, and subscribers are up toward 20 million, from virtually zero in 2022. So it's again been up and to the right, and continuing to accelerate even in 2025.
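As a back-of-the-envelope check, the growth figures above (8x users in 17 months, roughly $4 billion in revenue across 20 million subscribers) imply the rates below. This is a rough sketch using the report's numbers as quoted in the video, not independently verified figures:

```python
# Figures as quoted in the video (not independently verified).
growth_factor = 8.0   # ChatGPT user growth over the window
months = 17           # length of the window in months

# 8x in 17 months implies this compound monthly growth rate.
monthly_rate = growth_factor ** (1 / months) - 1
print(f"Implied compound monthly user growth: {monthly_rate:.1%}")  # ~13.0%

# Rough revenue per subscriber: ~$4B spread over ~20M subscribers.
# (Crude, since some revenue likely comes from non-subscription sources.)
per_subscriber = 4e9 / 20e6
print(f"Revenue per subscriber: ${per_subscriber:.0f}/year, "
      f"~${per_subscriber / 12:.0f}/month")
```

About $200 a year per subscriber sits just below the roughly $20-per-month ChatGPT Plus price, which is at least a plausible consistency check on the quoted figures.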
And then this one is my surprise: the time to 365 billion annual searches, or a billion searches a day. ChatGPT got to that 5.5 times faster than Google did. Mary calls out that ChatGPT hit 365 billion annual searches in 2 years versus Google's 11 years, which is just wild. Now, I know there are differences in internet penetration and other things that affect that, but still, the speed is astonishing.

Mary also calls out some up-and-to-the-right trends in capital expenditure and internet infrastructure. Nvidia's installed GPU computing power has gone up 100x in six years. 100x. That one also got me to do a double take. There were a lot of double takes in these slides, because I knew the numbers were big, but it was just wild. Capex spend at the big six: I knew that was inflecting, but it's really astonishing how much the growth rate has scaled up since 2020. We can see the beginning of the AI buildout in the big cloud providers in 2020. Data-center buildout also had a major inflection point, but it came a little later, in 2023, as AI started to hit. It's been up 49% a year since 2023, which is insane.

At the same time, there are a few charts that are down and to the right. Energy required per LLM token: I am not kidding you, a 105,000x decline in the energy required to generate a token over the last decade. If you want to look for a reason why some of what we're experiencing today is possible, that's it.
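Annualizing the infrastructure figures above (100x installed GPU compute in six years; data-center buildout up 49% a year since 2023) makes the pace easier to compare. A minimal sketch using the quoted numbers:

```python
# Figures as quoted in the video.
gpu_growth, gpu_years = 100.0, 6

# 100x over six years implies this compound annual growth rate.
gpu_annual = gpu_growth ** (1 / gpu_years) - 1
print(f"Implied annual GPU compute growth: {gpu_annual:.0%}")  # ~115%

# 49% a year compounds quickly: after n years the buildout is
# 1.49^n times the 2023 baseline.
dc_rate = 0.49
for years in (1, 2, 3):
    print(f"Buildout after {years} year(s): {(1 + dc_rate) ** years:.2f}x")
```

In other words, at the quoted rates the installed GPU base more than doubles every year, and data-center buildout more than triples relative to its 2023 baseline within three years.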
It's 105,000 times cheaper to generate a token than it was 10 years ago, measured across the NVIDIA GPU set. Similarly, AI inference costs are dropping through the floor: the cost to serve a model is 99.7% lower over two years. AI cost-efficiency gains look like a cliff, as you would expect. And Mary does an interesting job on this slide: she compares the light bulb, the computer memory chip in particular, and the cost to generate a roughly 75-word response in ChatGPT. You don't have to know the exact numbers to get the general idea: the light bulb took something close to 75 years to drop as far in cost as ChatGPT has dropped in two years.
And that's just wild to me. And because cost is lower, model performance is converging, which is why DeepSeek's gains are not that surprising. If you look at the overall arena scores, which I know are not perfect but are at least a head-to-head comparison: Google, OpenAI, and DeepSeek have all converged, and they were very, very different just a year-plus ago.
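The cost declines quoted above (105,000x less energy per token over a decade; 99.7% lower cost to serve over two years) can be annualized the same way. A rough sketch, taking those numbers at face value:

```python
# Figures as quoted in the video.
energy_drop, energy_years = 105_000, 10
serve_remaining, serve_years = 1 - 0.997, 2  # 99.7% cheaper => 0.3% remains

# Geometric-mean annual factor for the energy-per-token decline.
energy_annual = energy_drop ** (1 / energy_years)
print(f"Energy per token fell ~{energy_annual:.1f}x per year "
      f"(a ~{1 - 1 / energy_annual:.0%} annual decline)")

# Annual decline implied by a 99.7% cost drop over two years.
serve_annual = 1 - serve_remaining ** (1 / serve_years)
print(f"Cost to serve fell ~{serve_annual:.0%} per year")
```

So the decade-long energy trend works out to roughly a 3.2x improvement per year, while the two-year serving-cost trend is far steeper, around a 95% decline per year.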
Seeing that convergence highlights what Sam has called out: we don't live in a world where there's going to be one winner in AI. We live in a world with multiple winners and fierce competition. And this points to one of the areas where Mary and I diverge a little. Mary views this fierce competition in classical economic terms: she thinks of it as competition that's good for consumers. What I notice is that consumers seem to have already anointed a winner in ChatGPT, and to a lesser extent Gemini, and I don't see the proliferation of apps powered by these foundation models that I would expect in a true consumer revolution. People seem to be leaning into the habit stack they already have with ChatGPT.
I do think this sort of vicious competition is going to be very good for business-to-business use cases, where we see much wider adoption of these different models across different business use cases and lanes. So the thing that stands out to me, which Mary doesn't really get into in the deck, is that a very different future is unfolding empirically for B2B than for B2C. B2C seems like a lottery where you're competing with ChatGPT right now, while B2B looks a lot more like: we have these individuated use cases, foundation models won't necessarily ever cover them, we need to build a particular tool for this particular use case, and you can have a lot of winners in the niches and the margins there. In that world, having lower overall model cost and cost to serve makes a big difference from a unit-economics perspective.

Now, that gets at one of the things that Mary calls out where it's not really clear how the gap gets fixed.
Fundamentally, AI model companies have raised something close to a hundred billion dollars (Mary pegs it at $95 billion), and they've only cleared about $11 billion in annualized revenue. Now, that number is rising really fast as these model makers start to scale. I think Anthropic is literally off the charts right now, because Mary pegs them at $2 billion and I recently heard $3 billion as an annualized rate, so they're really exploding, particularly since the Claude 4 launch a couple of weeks ago. But the overall picture remains the same: they've raised about 10 times more than they've delivered in annualized revenue.
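The overhang ratio is a one-line calculation with the figures just quoted ($95 billion raised versus roughly $11 billion in annualized revenue):

```python
# Figures as quoted in the video.
raised = 95e9              # capital raised by AI model companies
annualized_revenue = 11e9  # annualized revenue across those companies

ratio = raised / annualized_revenue
print(f"Capital raised is ~{ratio:.1f}x annualized revenue")  # ~8.6x
```

That works out to roughly 8.6x, which the speaker rounds up to "about 10 times."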
And there's a tremendous capital overhang there, which means a big question mark around how we resolve that funding discrepancy. Because if you have a capital overhang, vicious competition, and cost to serve going down, there's tremendous margin pressure on token utility and cost per token. At the end of the day, what you're selling is something that depreciates really fast: tokens. And it costs a lot to make a new model. I don't know how you clear money on that long term, and I think that's one of the interesting question marks for Anthropic, for OpenAI, for Google. Of those, Google obviously has the deepest pockets and can sustain this the longest, and that may be part of their strategy.

But at the end of the day, this is a real discrepancy, and the bill is going to have to be paid at some point. I remember when Uber was dirt cheap and everyone was taking $2 rides here and there. Well, now they're $20, $25 rides. Part of how Uber closed their profitability gap was that they started charging the economic price. I do wonder if at some point model makers are going to close this revenue gap by substantially raising prices, and we will have to see if they're able to retain users in that scenario.

All right, moving along a little bit. I think one of the other takeaways I had is that AI agent interest is up as much as people think.
It's up 1,088% over the last 16 months if you look at Google search trends. But, and this is again where I would add a little nuance to Mary's take, I do not think we are seeing very many practical use cases of agents outside of two camps: very large companies that have strong LLM engineering teams, and very tidy pre-built agents that do very narrow things. Those are the two places where I see wins. And there's a big, messy middle in the mid-market, where companies have custom needs but don't have the capital to get strong AI engineering talent, and their needs are too custom for the pre-built stuff. So what do they do? There's not really a great answer to that right now, and I think that's an area where a little more nuance is helpful in terms of understanding what's really going on.

All right, we're going to skip over some of this; I know we've already gone on a long way. One thing I do want to call out is that it's not just you imagining everyone talking about the AI hype. The proportion of S&P 500 firms mentioning AI during their quarterly earnings calls is now over 50%. It has skyrocketed from 10% in just a year, maybe a year and a half. Absolutely wild.
And there has been a doubling in developers, startups, and apps in the NVIDIA AI ecosystem to serve all of those companies. So in a sense, we're in the middle of a gold rush, and she actually names it: Mary calls out that famous venture-capital analogy of selling picks and shovels in the gold rush. There's a whole run of about 10 or 15 slides where she does nothing but talk about the companies that are selling chips and monetizing really effectively. A lot of her case is that selling tokens is a low-margin business, but selling chips is a high-margin business. So she likes Google for their TPUs, she obviously likes Nvidia and how they've managed their business, and we will have to see how the major model makers handle their monetization strategy.

Okay, we've gone through a lot of the deck, but at a very high level. I know this may seem like a long video, but I promise you reading 340 slides is longer. If you want to dive deeper on this, feel free to grab my Substack; I'll link it here. It is a full readout.
Still much shorter than the deck, but you get a sense of what she did, and you can check out all of those charts. I chose not to scroll through the deck on screen, because whenever I do that, I never get views on those videos. You guys seem to like my face, which is kind of weird, but here we are. You can also see my take on the deck, and a little more about the take on the internet: I include perspectives from Axios and from other places where Mary has given interviews, so you get an overall picture of this deck.

Is it worth it? Yes. This is probably the deck that will be most influential to how capital allocators, VCs, and investors think about AI for the rest of this year. It is absolutely worth this degree of attention. And if you're not an investor, it's worth it to you because this shapes how the investors who drive companies and drive job creation are going to be thinking about this stuff. And that affects all of us. Whether a job opens up or not is a function of whether the startup funding is there, and if the startup funding is there, it might be because Mary Meeker made a recommendation in this deck, very bluntly. So I want to make sure everybody is aware of this. I've made my write-up completely free so that everybody can dive in and look at it, and I've obviously linked to the deck so you can see the full thing if you want to. And that's where I'll leave it. It was a lot of fun for me to go through all 340 pages of the deck. I know that's not fun for everybody, but I'm a nerd like that. Hope you enjoyed this summary.