Prompt Engineering Becomes the Product
Key Points
- With o3 and o4-mini-high, the prompt itself is becoming the deliverable, because the model’s outputs are often complete enough to require little downstream processing.
- These newer models are “agentic,” able to call tools and automate tasks (e.g., weekly competitor‑site scraping), turning a simple prompt into a programmable workflow.
- The rise of agentic LLMs makes tool integration far more accessible to everyday users than the earlier ecosystem of platforms like n8n, Lindy, or LangGraph.
- The rapid adoption curve—ChatGPT projected to hit a billion users in three years, outpacing Facebook—illustrates how improved technology streamlines the product‑to‑customer‑value pipeline.
- This shift reshapes prompt engineering work: prompts must be explicitly purpose‑driven, specify expected outputs, and users need new skills and performance expectations beyond any single model release.
Sections
- [00:00:00](https://www.youtube.com/watch?v=778I2wQQsm0&t=0s) **Prompt Becomes the Product** - The speaker explains how advanced LLMs now deliver finished outputs, turning the crafted prompt itself into a marketable deliverable, especially as models gain agentic tool‑calling capabilities.
- [00:03:32](https://www.youtube.com/watch?v=778I2wQQsm0&t=212s) **Prompts Becoming Primary Output** - The speaker reflects on how competition among AI model creators is turning prompts into the central work product of professionals.
Full Transcript
**Source:** [https://www.youtube.com/watch?v=778I2wQQsm0](https://www.youtube.com/watch?v=778I2wQQsm0) · **Duration:** 00:03:50
I want to talk about the idea of the
prompt becoming the product with o3. One
of the things I've been reflecting on is
that prompt engineering for a long time
with pre-trained models was prompt
engineering for the purpose of getting a
response that you would then use
somewhere else. But with o3 and
o4-mini-high, the answers can be so complete
that in many cases you
don't need to do a lot of reprocessing
from there on out. And for our
purposes as prompters, the prompt is
becoming our work product. And it's
worth thinking about it in those terms,
especially as you layer in the fact that
these newer models are also agentic. And
so you can tell o3 or o4-mini-high:
go fetch my competitor
websites every week, make a scheduled
task, come back with this, this, this,
and this, and it will do it. They are
functionally agentic models with a lot
of tool calling ability.
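The "tool calling" the speaker describes boils down to the model emitting a structured request that a runtime dispatches to a real function and feeds back. Here is a minimal, self-contained sketch of that loop; the tool name (`fetch_page`), registry, and JSON shape are illustrative assumptions, not OpenAI's actual API:

```python
import json

def fetch_page(url: str) -> str:
    """Stand-in for a real web-fetch tool; returns canned text instead of hitting the network."""
    return f"<html>contents of {url}</html>"

# Registry mapping tool names (as the model would see them) to callables.
TOOLS = {"fetch_page": fetch_page}

def run_tool_call(call_json: str) -> str:
    """Dispatch a model-emitted tool call of the form {"name": ..., "arguments": {...}}."""
    call = json.loads(call_json)
    fn = TOOLS[call["name"]]
    return fn(**call["arguments"])

# An agentic model asked to scrape a competitor site would emit something like:
model_output = json.dumps(
    {"name": "fetch_page", "arguments": {"url": "https://example.com"}}
)
result = run_tool_call(model_output)
```

In a real agentic run, `result` would be appended to the conversation and the model would decide the next step — loop until it has everything the prompt asked for.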
And they're not the only ones. There are
other models out there that are agentic
as well. But I'm calling them out
because they're widely distributed. And
I'm calling them out because they are
making the underlying technology of
agentic tool use much more transparent
to the general user than it's been
before.
Prior to the launch of truly agentic
conversational models, you
had to go to n8n, you had to go to
Lindy, you had to go to LangGraph, you
had to go to a lot of other places to get
agents really going for you, and lots of
people did that and those businesses are
doing well. But the mass-adoption
footprint you get with agentic models is
really interesting in this situation and
has reminded me of one of the
fundamental through lines in product
from technology to product to magical
customer value. You can trace that line
with the iPhone. You can definitely
trace it with ChatGPT, which is on track
to hit a billion users in 3 years which
is roughly three times faster than
Facebook.
And the thing that I'm thinking about,
as I sort of meditate on that idea of
technology to product to customer value,
is that the through line gets simpler and
clearer with better products: if the model
is better when it's released, it makes
that line even simpler and clearer than
it was when the last model was
out. And as I think about it, one of the
things that stands out about the release
of o3 and o4-mini-high is they make this
idea of agents and tool calling simpler
and clearer than it was. And therefore,
they challenge us to prompt differently.
Our prompting becomes more of the final
product. Our prompting needs to be clear
about our purpose. Our prompting needs
to be clear about expected outputs. And
I think that imposes a different kind of
work responsibility on us. I don't
think we've thought enough about how
résumés change. I don't think we've thought
enough about how work experience
changes, how expectations of performance
change. Those are all really rich areas
where we need to figure out how to level
up. And that's much beyond a particular
model release. It doesn't matter whether
o3 ultimately is successful or is
retired next week. I don't think it's
going to get retired next week. And it
doesn't matter if DeepSeek drops, you
know, the next version a week after that
and it's incredible. I'm sure
they'll do great things. The point is
there's an arms race to simplify this
incredible through line that is bringing
large language models, the underlying
technology, with tool use, through the
product phase to magical customer value.
And as all of these model makers compete
to deliver on that ecosystem, we have to
think about how our prompts change and
how our prompts are more and more our
ultimate work product. And that's a
strange thought. I don't know about you,
but I did not expect to live in a world
where my prompts were my work product.