Learning Library

← Back to Library

AI Week: Platform Consolidation, Claude Skills, Cancer Breakthrough

Key Points

  • The AI industry is consolidating around a few dominant labs (Anthropic, OpenAI, Microsoft, Google) that are racing to own the full “agent layer,” threatening middleware firms with commoditization as platforms embed these capabilities natively.
  • Simpler, language‑driven workflows outperform heavyweight scaffolding; natural‑language iteration and minimal‑overhead approaches consistently deliver stronger results than elaborate prompt‑engineering or RAG pipelines.
  • Vertical integration is becoming a competitive priority, with companies building end‑to‑end compute stacks and investing heavily in custom silicon to secure supply‑chain sovereignty for AI workloads.
  • A paradigm shift from keyword search to conversational “discovery” is reshaping commerce and marketing, as users move toward intent‑driven interactions rather than explicit queries.
  • AI is emerging as a scientific discovery engine, exemplified by Anthropic’s Claude Skills platform for reusable agent workflows and Google’s cancer‑focused models that generate and experimentally validate novel drug hypotheses.

Source: https://www.youtube.com/watch?v=jv8sJHVmySg
Duration: 00:10:01

Sections

  • 00:00:00 (https://www.youtube.com/watch?v=jv8sJHVmySg&t=0s) Key AI Strategic Trends This Week: The speaker outlines five pivotal AI developments (platform consolidation, the superiority of simple workflows, vertical integration with custom silicon, the shift from search to conversational discovery, and AI's emergence as a scientific discovery engine) before diving into recent stories and actionable takeaways.
  • 00:03:43 (https://www.youtube.com/watch?v=jv8sJHVmySg&t=223s) OpenAI Broadcom Deal, Walmart Integration: OpenAI announced a multi-year partnership with Broadcom for custom AI chips to meet soaring compute demand, while also launching chat-based instant checkout with Walmart, enabling product recommendations and purchases directly within the conversational interface.
  • 00:08:15 (https://www.youtube.com/watch?v=jv8sJHVmySg&t=495s) Advocating Accessible AI Agent Building: The speaker warns that command-line tools may restrict AI model education to developers, then recommends Peter Steinberger's "Just Talk to It" for its compelling case that simple, iterative agent design can replace complex infrastructure, while acknowledging that larger back-office agent systems may still require more robust frameworks.

Full Transcript
I spent more than 20 hours following AI stories this week, and this is what you need to know. We're going to go into strategic principles first, get to the stories second, and then I'm going to give you takeaways third.

Strategic principles that came out this week. First, the platform consolidation thesis is intact. The major AI labs (Anthropic, OpenAI, Microsoft, Google) are racing to own the complete agent layer directly. Middleware and thin-wrapper companies face commoditization risk as platforms embed agent capabilities natively. Second, simplicity continues to beat infrastructure. The most effective AI workflows avoid elaborate scaffolding; natural-language iteration outperforms complicated prompt engineering and RAG systems, and minimal-overhead approaches take you really, really far. Third, vertical integration is a wave. Companies are controlling full compute stacks and continuing to invest aggressively in supply sovereignty, and custom silicon is becoming the way to go with AI. Fourth, discovery versus search is a big paradigm shift, and we're seeing beat after beat after beat on that week over week. Commerce and workflows are moving from explicit search queries to conversational intent discovery, with massive implications for marketers. Finally, AI is a scientific discovery engine. We're transitioning from hypothetical data analysis to cancer models that demonstrate novel, externally and experimentally validated scientific insights.

Let's get into the stories that delivered those beats. Story number one: Anthropic launched Claude Skills. It's a reusable agent customization platform that packages instructions, scripts, and resources together. I did a whole video on it; I think it's one of the biggest releases of the year.
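For context on what a Skill packages together: based on Anthropic's public description, a skill is a folder containing a SKILL.md of instructions plus optional scripts and resources the model loads on demand. The sketch below is illustrative only; the skill name, steps, and layout are an invented example, not copied from Anthropic's documentation, and the exact fields may differ.

```
weekly-ai-briefing/
├── SKILL.md        # instructions + metadata the model reads on demand
├── scripts/        # optional helper scripts the agent can run
└── resources/      # optional reference files, templates, data

# SKILL.md (illustrative sketch)
---
name: weekly-ai-briefing
description: Summarize the week's AI news into strategic takeaways
---
Steps:
1. Collect the stories provided in context.
2. Group them under the five strategic principles.
3. Output key points first, then per-story takeaways.
```

The point of the format is that the agent assembles this context on the fly only when the task calls for it, instead of you pasting orchestration prompts by hand.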
It takes the combination of manual orchestration, context, and prompts out of the equation so that you can compose what you need by assembling context on the fly. Simon Willison, whom I read often and really respect, called this a bigger deal than the Model Context Protocol, which we all know is all over the ecosystem. I agree; I think it's a huge release. Watch for how enterprises handle permissions with this. Watch for OpenAI's and Google's competitive response; there will be one. Watch for whether Skills become a standard abstraction for agentic workflows. My bet is yes.

Story number two: Google's cancer AI breakthrough. Two models demonstrated computational scientific discovery this week. DeepSomatic demonstrated competency in analyzing cancer sequences; it works across all major DNA sequencing platforms and is specifically good at analyzing mutations in cancers. Cell2Sentence is a 27-billion-parameter cell model designed to generate novel drug hypotheses, and it successfully generated a specific drug hypothesis, validated experimentally in a petri dish, showing that it could turn "cold" tumors "hot," that is, make them visible to the immune system. Look, I'm not a scientist, and I'm not going to comment on the scientific impact of each particular hypothesis here. What I am going to say is that we live in a world where AI has gone from "does it have hypothesis capabilities?" to two models in 48 hours with two novel, externally validated scientific breakthroughs. We are speeding up, and that is the big takeaway I have on the science side. This is going to get faster and faster and faster from here on out. We are going to see a speed-up in drug pipelines. It is not about this particular drug. It is not about this particular discovery.
It is about the wave of AI innovation pushing into medical and drug discovery pipelines. It's a big deal.

Story number three: OpenAI and Broadcom. OpenAI signed a multi-year collaboration for 10 gigawatts of custom AI accelerators. This is OpenAI saying that they cannot depend on Nvidia alone, that they are buying as many Nvidia chips as they can get and still don't have enough, and that they are going to have to buy more. They're going to Broadcom because that's the only way they can get compute demand met. This is not an "I don't want Nvidia" story. It's a story about demand for OpenAI scaling so fast that they need to go to every chip supplier on the planet. And that's why we talk about custom silicon.

Story number four: Walmart and OpenAI. ChatGPT's instant checkout is going to enable full transactional shopping within the chat interface, and Walmart is on board with one-tap checkout via Stripe. This means you can say something like "meal ideas for a family of four" in ChatGPT, get Walmart meal delivery options with specific ingredients, and buy from Walmart inside ChatGPT. Marketers will watch this very closely as we head into the holiday quarter. They're going to watch conversion rates versus Walmart. They're going to look at how retailer exclusivity policies are handled. What is Amazon's response? How do you handle privacy concerns? How do you measure intent? What are the key behavioral metrics? This is new territory for marketers. We have an entire brand-new channel that 10% of the world's population uses, and it is getting unlocked for commerce.

Now, story number five: Microsoft's Windows 11 agentic operating system. Microsoft just keeps shipping on agents and Copilot.
In this case, they're shipping "Hey Copilot" always-on activation. They're also shipping what they call extended context, which got a lot of pushback because it was read as a privacy violation: in a sense, Microsoft is saying Copilot can see your whole workstation all the time and remember everything, and employees have felt that was a violation of their privacy. That debate is going to go on. I expect Microsoft is going to win it, because enterprises have an interest in using agents to drive hardware and software productivity on laptops, and they will push employees, whether we like it or not, frankly, to go for it. Now, obviously, some folks with leverage are going to walk away and go to places that don't insist on the Microsoft ecosystem. We can have the conversation about Copilot and why Copilot hasn't felt like a cutting-edge LLM in a long time. But the reality is that Microsoft has enterprise customers it wants to cut cloud deals with, and everything pivots around that. The Windows deals, the Teams deals, all of the productivity deals pivot around cloud. That is the moneymaker for the company, and they think in terms of the moneymakers, the buyers, the enterprise customers. So that's story number five.

Story number six: Nvidia DGX Spark. It is a data-center-class AI development desktop positioned at just under $4,000. I can get into the specs: 144 ARM Grace CPU cores, and it runs 100 tokens a second for 7-billion-parameter models. It is essentially data-center AI at a consumer price point. So if you ever wanted to run a data-center-grade LLM, you could do so from your desktop.
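To make "data-center class at a consumer price point" concrete, here is a rough back-of-the-envelope sizing sketch for the 7-billion-parameter model class mentioned above. The rules of thumb (2 bytes per FP16 weight, 0.5 bytes per 4-bit weight) are general approximations, not official DGX Spark specs.

```python
# Rough sizing for local inference of a 7B-parameter model.
# Weight memory is approximately parameter count x bytes per parameter.

def model_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate weight footprint in GB (ignores KV cache and activations)."""
    return params_billions * 1e9 * bytes_per_param / 1e9

fp16_gb = model_memory_gb(7, 2.0)   # FP16: 2 bytes/param -> ~14 GB
q4_gb = model_memory_gb(7, 0.5)     # 4-bit quantized: 0.5 bytes/param -> ~3.5 GB

# At the quoted ~100 tokens/second, a 500-token answer takes ~5 seconds.
seconds_per_answer = 500 / 100

print(f"FP16 weights: {fp16_gb:.1f} GB, 4-bit weights: {q4_gb:.1f} GB")
print(f"500-token answer at 100 tok/s: {seconds_per_answer:.0f} s")
```

This is why a desktop box with large unified memory can comfortably serve models in this class that would not fit on a typical consumer GPU at full precision.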
And so this is going to democratize the availability of privacy-preserving local inference for developers who want to do edge deployment testing. It's going to give us a whole new compute category once it's established. It's not really a laptop, and it's not a desktop; it is a local LLM compute point. I'm really curious to see where this goes next year, because it could open up a whole new place on the desk for compute for people who want local LLMs. And we are going to have to see the software catch up, because right now this is for developers.

Story number seven: Andrej Karpathy's nanochat. He built a $100 do-it-yourself ChatGPT pipeline that's trainable in four hours. The point here is not this particular model. The point is that Karpathy, a brilliant innovator, a phenomenal educator, and an ex-OpenAI and ex-Tesla engineer, is interested in showing transparency around how models are trained. This becomes a phenomenal way for students to get exposure to AI training and to understand how AI models are built. It's something that should be in university curricula. If you want to learn how to build models from scratch, it becomes a very easy way to start getting into emerging techniques. I hope it is adopted by people interested in helping others learn how AI models work. But my fear is that because it is command-line and requires technical knowledge, we are going to see it once again limited to the developer community.

Okay, last but not least, I want to talk about my favorite read of the week: "Just Talk to It" by Peter Steinberger. Why should you read it? Because Steinberger makes a compelling case against agentic vendors.
There are a lot of agent vendors out there selling a lot of very fancy infrastructure, and Steinberger argues, based on his own experience building with agents, that you don't need as much infrastructure as you think you do. You should lean into iterative building with agents, and you should think of agent use as mirroring people management: use scope judgment when you talk to them, think about when you time interventions, and think about how you do course correction, just as you would with people. My one critique of the article is that while it is an excellent take from an engineering perspective, focused on individual productivity and managing agents, I have more questions about larger agentic frameworks that need to run big back-office operations; those tend to need more structure. So I read this as a refreshing take from a very senior engineering figure on how he builds with agents. Absolutely worth a read. I think there are takeaways for how we all talk to our LLMs, even if you're not an engineer. So dig into it.

And of course, if you want to see how all of this applies to you, I've got a prompt for that: the week's custom prompt to help you dig into the news and into Peter's article as well. Cheers.