The Modern Data Stack is ending, but not because technology failed. It’s ending because vendors realised they can sell the illusion of unification while locking you in.
The ecosystem that birthed the Modern Data Stack has matured, and vendors have begun to see the endgame. The promise of modularity, flexibility, and best-of-breed choices is giving way to a new narrative: unification at any cost. The latest whispers of a $5–10 billion Fivetran-dbt merger make this reality undeniable.
But this “seamlessness” is not unification in the architectural sense; it is unification in the narrative. Users are drawn into the story: one contract, one workflow, one vendor to call. But the vendor is locking you in before the market fully stabilises.
It looks like simplification, but it is actually enclosure. The illusion of a single platform conceals multiple stitched-together layers, each still bound by its own limitations, yet now difficult to escape. This is not just a vendor play; it is a structural shift, a reordering of the data ecosystem that forces practitioners to question what “unified” really means. — Read More
Tag Archives: Architecture
Not Everything Is an LLM: 8 AI Model Types You Need to Know in 2025
In 2023, if you said “AI”, most people thought of ChatGPT.
Fast-forward to 2025, and the landscape looks very different. LLMs (Large Language Models) may have ignited the AI revolution, but now we’re deep into an era of specialized AI models, each designed with a specific superpower.
Yet, somehow, everyone still calls them LLMs.
It’s like calling every vehicle a “car”, whether it’s a bicycle, a truck, or a plane. Sure, they all move, but they’re built for very different purposes. — Read More
Emerging Architectures for LLM Applications
Large language models are a powerful new primitive for building software. But since they are so new—and behave so differently from normal computing resources—it’s not always obvious how to use them.
In this post, we’re sharing a reference architecture for the emerging LLM app stack. It shows the most common systems, tools, and design patterns we’ve seen used by AI startups and sophisticated tech companies. This stack is still very early and may change substantially as the underlying technology advances, but we hope it will be a useful reference for developers working with LLMs now. — Read More
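The most common design pattern in that stack, in-context learning over retrieved documents, can be sketched in a few lines of plain Python. This is a toy illustration under stated assumptions: the bag-of-words `embed` function and the in-memory `DOCS` list are stand-ins for the embedding model and vector database a real stack would use, not anything prescribed by the original post.

```python
import math
from collections import Counter

# Stub "embedding": a bag-of-words count vector. A real LLM stack
# would call an embedding model and store vectors in a vector DB.
def embed(text):
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical in-memory corpus standing in for a document store.
DOCS = [
    "dbt transforms data inside the warehouse",
    "vector databases store embeddings for retrieval",
    "llms generate text from a prompt",
]

def retrieve(query, k=1):
    # Rank documents by similarity to the query; keep the top k.
    q = embed(query)
    return sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query):
    # Stuff the retrieved context into the prompt sent to the LLM.
    context = "\n".join(retrieve(query))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"
```

The final string from `build_prompt` is what would be handed to the model; swapping `embed`, `DOCS`, and `retrieve` for real components recovers the full pattern.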
Architectures
These are the lecture notes for FAU’s YouTube Lecture “Deep Learning”. This is a full transcript of the lecture video and matching slides. We hope you enjoy this as much as the videos. Of course, this transcript was created largely automatically using deep learning techniques, and only minor manual modifications were performed. If you spot mistakes, please let us know!
Part 1
Part 2
Man’s Search for the most accurate Neural Network Architecture
Neural network architecture is one of the key design choices when solving problems with deep learning and computer vision. Architectures are typically compared on two key factors: accuracy and computational cost. In general, as we aim to design more accurate neural networks, the computational cost rises. In this post, we shall explore the search for more accurate neural network architectures without worrying about computational cost. We shall also see how neural networks can be taught to design themselves, and how this technique is being used to discover better architectures (AutoML, or Neural Architecture Search).
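The core idea of architecture search can be sketched as a random search over a space of layer widths. Everything here is illustrative: the `proxy_score` function is a stand-in for actually training and validating each candidate network, which is what a real NAS system would spend its compute on.

```python
import random

# Toy search space: an architecture is a list of hidden-layer widths.
WIDTHS = [16, 32, 64, 128]

def sample_architecture(rng, max_depth=4):
    # Randomly pick a depth, then a width for each hidden layer.
    depth = rng.randint(1, max_depth)
    return [rng.choice(WIDTHS) for _ in range(depth)]

def param_count(arch, n_inputs=10, n_outputs=2):
    # Weights plus biases for each fully connected layer.
    sizes = [n_inputs] + arch + [n_outputs]
    return sum((a + 1) * b for a, b in zip(sizes, sizes[1:]))

def proxy_score(arch):
    # Placeholder for real training: reward capacity, penalize compute,
    # mimicking the accuracy-vs-cost tradeoff described above.
    accuracy_proxy = 1.0 - 1.0 / (1 + sum(arch) / 64)
    return accuracy_proxy - 1e-6 * param_count(arch)

def random_search(n_trials=50, seed=0):
    # Sample candidates and keep the best-scoring architecture.
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        s = proxy_score(arch)
        if s > best_score:
            best_arch, best_score = arch, s
    return best_arch, best_score
```

Real NAS systems replace the random sampler with a learned controller or evolutionary search, but the loop structure is the same.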

Now that We’ve Got AI What do We do with It?
Whether you’re a data scientist building an implementation case to present to executives, or a non-data-scientist leader trying to figure this out, there’s a need for a much broader framework of strategic thinking around how to capture the value of AI/ML.
Let’s start by just enumerating the broad categories of AI/ML business models. Most of us agree there are at least these four.
AI/ML Infrastructure
AI-First Full Stack Vertical Platforms
Applied AI – Optimization of the Current Business Model
Platformication – A Radical End Point for AI/ML Strategy
Read More
How to Configure the Number of Layers and Nodes in a Neural Network
Artificial neural networks have two main hyperparameters that control the architecture or topology of the network: the number of layers and the number of nodes in each hidden layer.
You must specify values for these parameters when configuring your network.
The most reliable way to configure these hyperparameters for your specific predictive modeling problem is via systematic experimentation with a robust test harness. Read More
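A minimal version of such a test harness might look like the following. The `evaluate` function is a placeholder for training and scoring a real network on your data; the point is the harness structure: a grid over layer and node counts, repeated runs per configuration, and mean and standard deviation of the scores.

```python
import itertools
import random
import statistics

def evaluate(n_layers, n_nodes, seed):
    # Placeholder for training and evaluating a real network on your
    # dataset; a synthetic score stands in so the harness is runnable.
    rng = random.Random(seed)
    base = 0.70 + 0.05 * min(n_layers, 2) + 0.02 * min(n_nodes, 32) / 32
    return min(base + rng.uniform(-0.01, 0.01), 1.0)

def grid_search(layer_options, node_options, n_repeats=5):
    # Try every (layers, nodes) combination, repeating each run with a
    # different seed to account for training variance.
    results = {}
    for n_layers, n_nodes in itertools.product(layer_options, node_options):
        scores = [evaluate(n_layers, n_nodes, seed) for seed in range(n_repeats)]
        results[(n_layers, n_nodes)] = (statistics.mean(scores),
                                        statistics.stdev(scores))
    return results

results = grid_search([1, 2, 3], [8, 16, 32])
best = max(results, key=lambda k: results[k][0])
```

Reporting the standard deviation alongside the mean is what makes the comparison robust: a configuration is only better if its advantage survives repeated runs.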
Explaining AI from a Life Cycle of Data
Explaining AI from the perspective of the life cycle of data is useful because more people are familiar with data than with code. I welcome comments on this approach. Read More
Google AI Chief Jeff Dean’s ML System Architecture Blueprint
ML has revolutionized vision, speech and language understanding and is being applied in many other fields. That’s an extraordinary achievement in the tech’s short history and even more impressive considering there is still no dedicated ML hardware. Read More