It’s appropriate that this week is Thanksgiving, because Anthropic just dropped the best coding model we’ve ever used: Claude Opus 4.5.
We’ve been testing Opus 4.5 over the last few days on everything from vibe coded iOS apps to production codebases. It manages to be great at both planning—producing readable, intuitive, and user-focused plans—and coding. It’s highly technical and also human. We haven’t been this enthusiastic about a coding model since Anthropic’s Sonnet 3.5 dropped in June 2024.
The most significant thing about Opus 4.5 is that it extends the horizon of what you can realistically vibe code. The current generation of new models—Anthropic’s Sonnet 4.5, Google’s Gemini 3, or OpenAI’s GPT-5.1-Codex-Max—can all competently build a minimum viable product in one shot, or fix a highly technical bug autonomously. But if you kept pushing them to vibe code more, they’d eventually start to trip over their own feet: the code would become convoluted and contradictory, and you’d get stuck in endless bugs. We have not found that limit yet with Opus 4.5—it seems to be able to vibe code forever. — Read More
Producer Theory
… One [theory] is that integrating all of this data is extremely valuable, and that the rush to do it—according to The Information, every major enterprise software company is building an “enterprise search” agent—is a very sensible war for a very strategic space. Google became the fourth biggest company in the world by being the front door for the internet; of course everyone wants to be the front door for work. And this messiness is just an intermediate state, until someone wins or we all run out of money.
A second theory, however, is that platforms aren’t as valuable as we think they are. For a decade now, Silicon Valley has come to accept, nearly as a matter of law, that the aggregators are the internet’s biggest winners. But aggregation theory assumes that production flows cleanly from left to right: from producers, to distributors, to consumers, with the potential for gatekeepers along the way. “Context”—especially if MCP succeeds in making it easy for one tool to talk to another—is not like that. Slack aggregates what Notion knows; Notion aggregates what Slack knows; ChatGPT aggregates what everyone knows, and everyone uses ChatGPT to aggregate everything. Producers are consumers, consumers become producers, and everyone is a distributor. People aren’t walking in an orderly line through one big front door; agents are crisscrossing through hundreds of side doors.
In that telling, the right analogy for context isn’t content, but knowledge. — Read More
Meet the new Chinese vibe coding app that’s so popular, one of its tools crashed
A Chinese vibe coding tool went viral so fast that its signature feature crashed just days after it launched.
LingGuang, an AI app for vibe coding and building apps with plain-language prompts, launched last Tuesday and reached over 1 million downloads in four days. By Monday, the app had crossed 2 million downloads, according to Ant Group, the Chinese tech group that built it.
On Monday, LingGuang ranked first among free utility apps on Apple’s mainland China App Store and sixth overall among free apps. — Read More
How Google Pulled Off Its Stunning, Rapid-Fire AI Turnaround
Google came into 2025 with its AI stumbles looming large. The company’s slow start to the generative AI race turned borderline catastrophic in 2024 when its products generated images of diverse Nazis, told users to eat rocks, and couldn’t match OpenAI’s shine. AI chat was seen as a major threat to search, and outsiders didn’t see a coherent strategy. In January, Google stock was on the sale rack and murmurs about CEO Sundar Pichai’s job security floated around the internet.
It’s not quite December, and Google has masterfully reversed course. Its AI models are world class. Its products are buzzy again. Its cloud business is booming. And search is stronger than ever. Its stock is up 56% this year and, at $3.59 trillion, it just surpassed Microsoft’s market cap. Now, no serious person would question Pichai’s job status. — Read More
AI Models Are Becoming a Commodity: Are You Ready for the 5 Second-Order Effects Reshaping Industries by 2026?
As AI models become as common as electricity, the real competitive advantage shifts. Discover the five critical second-order effects of AI commoditization and learn how industries are preparing for a transformed business landscape in 2026.
For the last several years, the conversation around artificial intelligence has been dominated by a narrative of scarcity and exclusive power. Having access to a state-of-the-art AI model was a golden ticket, a competitive moat that only a handful of tech behemoths could afford to build. That era is rapidly coming to a close. We are now entering the age of AI commoditization, where powerful models are becoming a standardized, widely accessible utility—much like cloud computing or electricity before them.
This seismic shift is being accelerated by fierce market competition, the proliferation of high-performance open-source models, and aggressive pricing from major cloud providers. The first-order effects are already visible and dramatic. We’re seeing a race to the bottom on pricing, with some analyses showing that the cost of using top-tier models dropped by over 80% in just one year. This democratization of access is just the beginning. — Read More
A tsunami of COGS
The AI industry is in correction mode. Last week Nvidia reported its earnings and the world was holding its breath. If they miss, it is so over. If they crush it, we are so back. In the end, earnings beat expectations, but the stock slid anyway after an initial bump. Many things in the AI boom smell bad. The way money fuels the investment spree is quite questionable, and it has become a meme, with the same $1T investment check changing hands among a small set of participants. We can call this “vendor financing”, but it is not a great look.
In my opinion, the players most at risk here are the hyperscalers like Microsoft, Amazon and Oracle, and the neocloud players like Nebius and CoreWeave. They sit in between the providers of chips like Nvidia and the buyers of compute like OpenAI. They really have no choice other than to buy real chips from Nvidia and hope that there will be sustainable demand (read: revenue > COGS) from the buyers of compute so that those buyers can honor their commitments. If not, the buyers of compute will walk away, resizing their commitments (or going bankrupt); Nvidia has already sold those GPUs; and the hyperscalers are left with billions of dollars and gigawatts of unused capacity that depreciate very quickly due to the short GPU lifespan. — Read More
How a global company lets its employees build with 30+ LLMs
TELUS is one of Canada’s largest telecom companies. With more than 100,000 employees globally, it’s the very definition of an enterprise.
When it comes to AI, many enterprise companies seem to have the same cookie-cutter approach: deploy GPT-5, add some guardrails, and call it a day.
Not TELUS. Despite their size and all the complexities that come with enterprise-level ops, this global company is thinking about AI in a totally different way. And I want to share with you what they’re doing – I think there’s a lot to take away from their story. — Read More
AI is Rewiring the Economy
It’s cheaper to cover a hole in the wall with a flat-screen TV than to fill it. Stuff is cheap, services are expensive. AI is about to fix that.
You either believe AI will displace jobs or you think it’s hype. I think that’s the wrong question. The right question is how AI reshapes the economy.
AI will force us to reconsider commerce, consumerism and the norms of our economy. We will enter a world where consumers buy less stuff, but with much higher conversion. Middle-income consumer populations will have less disposable income as their jobs come under pressure from AI, meaning consumerism ceases to be the driver of economic growth. — Read More
Boom, bubble, bust, boom. Why should AI be different?
The artificial intelligence revolution will be only three years old at the end of November. Think about that for a moment. In just 36 months AI has gone from great-new-toy, to global phenomenon, to where we are today – debating whether we are in one of the biggest technology bubbles or booms in modern times.
To us, what’s happening is obvious. We both covered the internet bubble 25 years ago. We’ve been writing about – and in Om’s case investing in – technology since then. We can both say unequivocally that the conversations we are having now about the future of AI feel exactly like the conversations we had about the future of the internet in 1999.
We’re not only in a bubble but in one that is arguably the biggest technology mania any of us has ever witnessed. — Read More