AI-Native Writing: Next Compute Leap
Key Points
- Code has evolved dramatically in just a few decades because it was built to work hand‑in‑hand with ever‑more powerful computers, whereas natural language was only later “bolted on” to technology.
- Modern software engineering practices—DevOps, CI/CD pipelines, testing and staging environments, GitHub, etc.—are recent innovations that exploit code’s computational design to dramatically improve development speed and reliability.
- While machines can now comprehend and generate natural language far better than before, they still haven’t mastered creating great literature, highlighting the gap between natural‑language understanding and true creative writing.
- Voice interfaces are essentially a revival of pre‑writing oral communication rather than a brand‑new breakthrough, but the next true compute‑native leap will be AI‑driven writing tools that are built specifically for the machine era.
Sections
- Code Evolution Outpaces Language - The speaker contrasts programming's rapid, compute‑driven evolution and tooling advancements with the slow historical change of natural language, highlighting how machines now finally grasp human language.
- AI‑Enhanced Document Production Workflow - The speaker argues that because knowledge work is linguistically complex, future automation should move beyond simple chatbots toward AI‑driven systems that treat documents like code—offering native variant generation, staged drafting, verification, and continual evolution.
- AI Model Development Pipeline - The speaker describes a multi‑stage workflow that moves a document through different AI models—drafting with o3, deep problem solving with o3 Pro, structuring with Opus 4, validation with Perplexity, and polishing with Sonnet 4—mirroring a software development pipeline and advocating this systematic approach over constantly searching for the “best” model.
Full Transcript
**Source:** [https://www.youtube.com/watch?v=l8Po2CyWZag](https://www.youtube.com/watch?v=l8Po2CyWZag) · **Duration:** 00:10:29
- [00:00:00](https://www.youtube.com/watch?v=l8Po2CyWZag&t=0s) Code Evolution Outpaces Language
- [00:04:10](https://www.youtube.com/watch?v=l8Po2CyWZag&t=250s) AI‑Enhanced Document Production Workflow
- [00:07:26](https://www.youtube.com/watch?v=l8Po2CyWZag&t=446s) AI Model Development Pipeline
Code has gone through more evolution in
the last 50 or 60 years than natural
language has gone through since it was
invented 200,000 years ago and since
writing was invented a few tens of
thousands of years ago maybe less.
The point is code is evolving fast because code was designed to evolve alongside ever more powerful computer systems, whereas natural language has only been bolted onto computers. It's not really a compute-native technology. I want
you to think about it. Code started out
as you compose it, you stick it in, you
hope it runs. We are so far past that
and we have seen step change
improvements as we have leveraged better
code practices with more complex
compute. So now we have DevOps as a
discipline. That wasn't a thing when I
was coming up. It was only a thing after
the 2010s. We have like the idea of
having a testing environment, the idea
of having a staging environment, the
idea of having CI/CD pipelines, the idea
of GitHub. These are all innovations
that combine the power of compute with
code. And they're possible because code
was designed first and foremost to be a
language that worked well with
computers. That's why we have it. It's a
simplified language. It worked well with
traditional computers.
Fast forward: now it's 2022. Machines
understand natural language. They
understand the language we have been
speaking for hundreds of thousands of
years. They understand the language we
have been writing. We for the first time
have machines that can master the
semantic technical complexity of
language. Natural language is much more complex than computer code. It can express a much
wider range of meaning. The complexity
it can handle is much greater. Great
literature in particular is extremely
dense, highly complex, etc. Machines can
speak and understand that. Machines have
been trained on that. And yes, I'm the
first person to say machines are not yet
writing great literature. I'm not
trying to make that claim here. The
point is this.
Our tools for writing on computers have
been bolted on for decades.
They've been bolted on, and we've just been tapping and typing on really the same fundamental technology that we've
had since the beginning. And I know that
one of the hot new things in 2025 is
voice. But guess what? It can't be that
hot. It's actually going back to before
we invented writing to when we were in
oral culture. It's just voice again,
only now we have computers in the mix.
It's not actually a new innovation.
It's just going back to the way our
brains were originally wired. And it's
often easier to engage that part of the
brain because writing is a learned
technology and human brains haven't
evolved to make it truly native yet. So
far so good. But you know what is going to happen that is truly compute-aligned, a genuinely compute-native innovation in writing? We are going to have AI-native tooling for writing. I don't
mean you have a chat bar and then
suddenly everything appears. I'm going
to give you a little hint of it, and I think we can infer a lot about how white-collar work is going to go based on this. Because if you think about it, how much of our work right now is document creation? If you're not coding, that is. Coding, as we've seen, is compute-native; it saw the fastest development before AI, and frankly, given Cursor, Windsurf, and others, I would argue it has seen the fastest development since AI. The rest of us are making documents. And you know what, I know there are enterprises that
have document pipelines that use AI. I'm
aware of them. I've advised on a few of
them, but those are almost without exception tightly complexity-constrained, because they have to be given the scale. Whereas traditional knowledge work is actually very complexity-expansive. It's closer to the range of natural
language. You have to do a lot of
different things with documents. It's
part of what makes white collar work
actually quite difficult to understand
and automate. It's really complicated.
And what I'm saying here is not like a
direct path to automation A to B. What
I'm saying is that as machines
understand our language, we can finally
develop software that leverages compute
to give us more options. And this is
where I come back around and say again,
it's probably not the chatbot, even
though the chatbot is what we're using
right now. It is probably going to be
something that puts optionality and
leverage first and foremost. So, for
example, look at the way you can get
multiple variants easily with AI.
There's no reason that has to be a point
and a click or a type away. It can be
native and obvious. There are a few
tools that already are playing with this
idea where you just have multiple
variants of everything you write.
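The multi-variant idea can be sketched in a few lines. This is a hypothetical illustration, not any real tool: `call_model` stands in for whatever LLM completion API a writing tool would use, sampled at different temperatures so the variants actually differ.

```python
# Sketch of native multi-variant drafting: every rewrite request fans out
# into several candidate versions instead of a single answer.
# `call_model` is a hypothetical placeholder for a real LLM API call.

def call_model(prompt: str, temperature: float) -> str:
    # Placeholder: a real implementation would call an LLM here.
    return f"[variant @ t={temperature}] {prompt}"

def draft_variants(passage: str, n: int = 3) -> list[str]:
    """Return n candidate rewrites of a passage, sampled at
    increasing temperatures so each variant comes out different."""
    prompt = f"Rewrite for clarity: {passage}"
    return [call_model(prompt, temperature=0.3 + 0.3 * i) for i in range(n)]

variants = draft_variants("Code evolves faster than natural language.")
```

A native tool would surface these variants side by side rather than behind a chat turn.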
Another one, why don't we have the
concept of production code for writing
documents? I know we have draft and I know we have final, but we don't really think of our software for documents as something we could evolve with AI, so that we have the right model for drafting, the right verification step and staging for checking our facts and claims, and we treat it like code in the sense that we check it for clarity, we check it for coherence, and then we finally deploy it to production.
Is that too technical a way of thinking about it? I don't think so, because I think you can take that same principle, and then at the end, if you
want to dress it up and make it a fancy
report that just becomes a separate step
at the presentation layer. The core of
the context, the core of the text is
still there. Imagine how much easier it
is if you can deploy text across
multiple channels at once. That way,
it's like being able to deploy code to
multiple boxes. You would be able to
say, "Okay, so we're going to tweak this
core message. We're going to send it in
um a multivariant stream, right? We're
going to have cohort one be the executive team, cohort two be the marketers, cohort three be customer success, and you're sending the same update, but you're tuning it to what they need to hear and focus on."
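That fan-out could be sketched like this. The cohort names follow the example above; `tune` is a hypothetical stand-in for an LLM rewrite call, not a real API.

```python
# Sketch of "deploy text to multiple boxes": one core update is tuned
# per audience cohort before going out. `tune` is a hypothetical
# placeholder for an LLM rewrite step.

COHORTS = {
    "executive team": "focus on business impact and risk",
    "marketers": "focus on positioning and outward messaging",
    "customer success": "focus on what changes for existing customers",
}

def tune(core_message: str, instruction: str) -> str:
    # Placeholder: a real implementation would call an LLM here.
    return f"{core_message} ({instruction})"

def deploy(core_message: str) -> dict[str, str]:
    """Fan one core message out into a cohort-tuned variant per audience."""
    return {cohort: tune(core_message, note) for cohort, note in COHORTS.items()}

update = deploy("We are moving the release date to Friday.")
```

The point of the design is that the core message stays single-sourced, like code deployed to many machines.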
That's all stuff that was not possible
before because AI did not give us the
chance to understand natural language
until the popularization of large
language models. We had AI before then,
but true large language models were the
breakthrough we needed to grasp the
depth and complexity of text. And that
enables us for the very first time to
have compute platforms that actually
evolve the way we think and write and
take us beyond what we've been doing for
tens of thousands of years. I'm very
excited about it. I can't tell you how
exactly it's going to look, but I will
say I am already seeing professional AI
workers do this manually.
For example,
I am going to be moving my document flow from ChatGPT to drafting with o3, because I think o3 is a good conceptual thinker. Maybe sometimes, if it's a hard problem, I'll go to o3 Pro, and then move from there into Opus 4 to understand and structure the problem a little bit. You see how I'm thinking about it? Moving from the dev environment, I'm now starting to move it into almost a pull-request merge scenario, where I have to think about the structure of what I'm writing and whether it is congruent with other things that I've written and other things I'm focused on. Then I move it to what I would call testing, where you're going to Perplexity with that document and testing whether the claims are true. And then I move it for polishing to Sonnet 4, also a Claude model, because Sonnet 4 is an exceptional writer and is able to polish that text a little bit more. I would call that, you know, staging, getting ready for production, whatever you want to say. The point is, I am essentially mimicking that dev pipeline. And I'm not the only one. Lots
of people are doing this, but I think
we've been doing it individually on our
own. And we talk about it as if it's
finding the best model for this and this
and it gives us a headache because it
feels like we have to constantly be
picking better models. I think it's more
stable and helpful to think about it as
we are not equipped yet with tools that
make writing native for AI. We could be, eventually. In the meantime, this is our best way to manually simulate it. And if
we understand the jobs as mapping, sort of loosely, to the leverage that code has been able to get through more compute, I think we're going to get farther. Because, to be honest, it's actually not that much of a stretch to say most knowledge work goes through development, it goes through testing, it goes through a merge process (we call that peer review), and it eventually gets to production. It's kind
of how it works. If we can figure out
how to do that with computer programs
that are small, with smaller-scale compute, we're going to be able to figure out how to make that easier for ourselves, with LLMs, with larger-scale compute, and with all the power of
natural language that makes knowledge
work so interesting.
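The manual flow described above can be written down as an explicit, code-like pipeline configuration. This is a sketch under assumptions, not a real integration: `run_stage` is a hypothetical placeholder for calls to the actual tools (o3 for drafting, o3 Pro for hard problems, Opus 4, Perplexity, Sonnet 4).

```python
# Sketch of the document pipeline as configuration: each stage names the
# model used and the dev-pipeline step it mirrors. `run_stage` is a
# hypothetical placeholder for routing text to the real models.

PIPELINE = [
    ("draft",     "o3",         "dev environment: conceptual first draft"),
    ("structure", "Opus 4",     "merge/PR: check congruence with prior work"),
    ("verify",    "Perplexity", "testing: check facts and claims"),
    ("polish",    "Sonnet 4",   "staging: final pass before production"),
]

def run_stage(text: str, stage: str, model: str) -> str:
    # Placeholder: a real implementation would send text to each model.
    return f"{text} -> {stage}({model})"

def produce(document: str) -> str:
    """Thread a document through every stage in order, like a CI/CD run."""
    for stage, model, _purpose in PIPELINE:
        document = run_stage(document, stage, model)
    return document

result = produce("draft text")
```

Writing the stages down as data is the point: once the pipeline is explicit, swapping a model is a one-line config change instead of a hunt for the "best" model.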
So there you go. That is the thing that
has been keeping me awake at night. That
is the thing I cannot stop thinking
about.
I am so excited
because I feel like we are standing on
the edge of a different way of writing
for the first time in a very long time.
If you're out there building on that, if
you're out there working on that, I know
a few founders who are. I'm excited for
you guys and cheering for you guys. In
the meantime, I'm going to keep
documenting how I build, how I learn,
how I think, and yeah, good luck out
there, guys. Cheers.