… Some time ago I wrote about “the 70% problem”, where AI coding took you to 70% completion and then left the final 30%, the last mile, to humans. That framing may now be evolving. The percentage may shift to 80% or higher for certain kinds of projects, but the nature of the problem has changed more dramatically than the numbers suggest.
Armin Ronacher’s poll of 5,000 developers complements this story: 44% now write less than 10% of their code manually, and another 26% are in the 10-50% range. We’ve crossed a threshold. But here’s what the triumphalist narrative misses: the problems didn’t disappear, they shifted. And some got worse. — Read More
Why We’ve Tried to Replace Data Analytics Developers Every Decade Since 1974
This article was inspired by Stephan Schwab’s excellent piece “Why We’ve Tried to Replace Developers Every Decade Since 1969”, which traces the recurring dream of eliminating software developers from COBOL through to AI. Reading it, I recognised the same pattern playing out in my own field of data warehousing, data analytics and business intelligence: a fifty-year cycle of tools promising to democratise data work, each delivering genuine value while leaving the fundamental need for specialists stubbornly intact.
Every decade brings new promises: this time, we’ll finally make building analytics platforms simple enough that we won’t need so many specialists. From SQL to OLAP to AI, the pattern repeats. Business leaders grow frustrated waiting months for a data warehouse that should take weeks, or weeks for a dashboard that should take days. Data teams feel overwhelmed by request backlogs they can never clear. Understanding why this cycle has persisted for fifty years reveals what both sides need to know about the nature of data analytics work. — Read More
The private cloud returns for AI workloads
A North American manufacturer spent most of 2024 and early 2025 doing what many innovative enterprises did: aggressively standardizing on the public cloud for data lakes, analytics, CI/CD, and even a good chunk of ERP integration. The board liked the narrative because it sounded like simplification, and simplification sounded like savings. Then generative AI arrived, not as a lab toy but as a mandate. “Put copilots everywhere,” leadership said. “Start with maintenance, then procurement, then the call center, then engineering change orders.”
… The most valuable AI use cases were those closest to people who build and fix things. Those people lived near manufacturing plants with strict network boundaries, latency constraints, and operational rhythms that don’t tolerate “the provider is investigating.” Within six months, the company began shifting its AI inference and retrieval workloads to a private cloud located near its factories, while keeping model training bursts in the public cloud when it made sense. It wasn’t a retreat. It was a rebalancing. — Read More
AI models are showing a greater ability to find and exploit vulnerabilities on realistic cyber ranges
A recent evaluation of AI models’ cyber capabilities found that current Claude models can succeed at multistage attacks on networks with dozens of hosts using only standard, open-source tools, rather than the custom tooling previous generations required. This illustrates how rapidly the barriers to using AI in relatively autonomous cyber workflows are coming down, and highlights the importance of security fundamentals like promptly patching known vulnerabilities. — Read More
#cyber

The Adolescence of Technology
There is a scene in the movie version of Carl Sagan’s book Contact where the main character, an astronomer who has detected the first radio signal from an alien civilization, is being considered for the role of humanity’s representative to meet the aliens. The international panel interviewing her asks, “If you could ask [the aliens] just one question, what would it be?” Her reply is: “I’d ask them, ‘How did you do it? How did you evolve, how did you survive this technological adolescence without destroying yourself?’” When I think about where humanity is now with AI—about what we’re on the cusp of—my mind keeps going back to that scene, because the question is so apt for our current situation, and I wish we had the aliens’ answer to guide us. I believe we are entering a rite of passage, both turbulent and inevitable, which will test who we are as a species. Humanity is about to be handed almost unimaginable power, and it is deeply unclear whether our social, political, and technological systems possess the maturity to wield it. — Read More
Building Brains on a Computer
I first heard people seriously discussing the prospect of “running” a brain in silico back in 2023. Their aim was to emulate, or replicate, all the biological processes of a human brain entirely on a computer.
In that same year, the Wellcome Trust released a report on what it would take to map the mouse connectome: all 70 million neurons. They estimated that imaging would cost $200-300 million and that human proofreading, or ensuring that automated traces between neurons were correct, would cost an additional $7-21 billion. Collecting the images would require 20 electron microscopes running continuously, in parallel, for about five years, and the resulting data would occupy about 500 petabytes. The report estimated that mapping the full mouse connectome would take up to 17 years of work.
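To make those figures concrete, here is a rough back-of-envelope sketch (my own arithmetic, not from the Wellcome report), assuming the 500 petabytes are acquired evenly across the 20 microscopes over the five years:

```python
# Back-of-envelope check on the Wellcome Trust figures quoted above.
# Assumption: the ~500 PB of imagery is spread evenly across 20 microscopes
# running continuously for 5 years.

SECONDS_PER_YEAR = 365 * 24 * 3600

total_bytes = 500e15   # ~500 petabytes of raw imagery
microscopes = 20       # electron microscopes running in parallel
years = 5              # continuous acquisition time

per_scope_bytes = total_bytes / microscopes                    # data per microscope
rate_bytes_per_s = per_scope_bytes / (years * SECONDS_PER_YEAR)  # sustained throughput

print(f"{per_scope_bytes / 1e15:.1f} PB per microscope")
print(f"{rate_bytes_per_s / 1e6:.0f} MB/s sustained per microscope")
# -> roughly 25 PB each, on the order of 150-160 MB/s, every second for five years
```

Under those assumptions, each instrument would have to stream imagery around the clock for half a decade, which helps explain the report’s cost and timeline estimates.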
Given this projection — not to mention the added complexity of scaling this to human brains — I remember finding the idea of brain emulation absurd. Without a map of how neurons in the brain connect with each other, any effort to emulate a brain computationally would prove impossible. But after spending the past year researching the possibility (and writing a 175-page report about it), I’ve updated my views. — Read More
The Duelling Rhetoric at the AI Frontier
At Davos 2026, Anthropic CEO Dario Amodei told a room full of the world’s most influential investors that AI would replace “most, maybe all” of what software engineers do within six to twelve months. A few hours later, Google DeepMind CEO Demis Hassabis took the same stage and said current AI systems are “nowhere near” human-level intelligence, and that we probably need “one or two more breakthroughs” before AGI arrives.
Both men run frontier AI labs. Both have access to roughly the same benchmarks, papers, and internal capabilities data. Yet their public forecasts diverge so dramatically that at least one of them must be either wrong or strategically misleading. The interesting question is which, and why. — Read More
AI and jobs: The decline started before ChatGPT
You’ve probably seen the headlines: AI might be killing jobs for the young. A widely shared academic paper – the “canaries in the coal mine” paper by Stanford colleagues – found a 16% employment decline for young workers (ages 22-25) in AI-exposed occupations since ChatGPT launched in November 2022. The implication seems clear: AI is already eliminating the first rung of the career ladder, and we’re witnessing the beginning of a massive technological displacement.
It’s a compelling narrative, and it matches our fears. After all, if AI can write code and answer customer queries, why would companies hire junior people to do those things?
But a new paper from the Economic Innovation Group looks more carefully at the data, and when you do, the story becomes a lot less clear. The paper is by Zanna Iscenko (AI & Economy Lead, Chief Economist’s Team) and Fabien Curto Millet (Chief Economist), both at Google. — Read More
Google’s Demis Hassabis, Anthropic’s Dario Amodei Debate the World After AGI
Introducing The Eleven Album
Today, ElevenLabs launches The Eleven Album, a landmark musical release created in collaboration with world-class artists and powered by Eleven Music, our model for generating fully original, studio-quality compositions.
Spanning rap, pop, R&B, EDM, cinematic scoring, and global sounds, the album brings together GRAMMY-winning legends, chart-topping producers, and next-generation creators to explore what’s possible when artists and AI create together. — Read More