Why Renting AI Is The New Fairytale
Renting GPT-4 was cool in 2024. Owning Llama 4 is how you build a moat in 2025.
Quick Insights 🔎
Startup snippets
Marketers still missing hidden influence
Nearly 80% of marketers admit full-funnel attribution still eludes them. They're stuck crediting first or last clicks, completely missing the hidden influence in the messy middle.
OpenAI launches Codex, its new coding agent
OpenAI just rolled out Codex, an AI agent for developers that can write code, squash bugs, and run tests. The tool is now live for ChatGPT Pro users, with more rollouts coming.
Google drops NotebookLM apps on mobile
NotebookLM, Google’s AI-powered research sidekick, is finally live on Android and iOS just in time for I/O 2025.
Startup mental model 🧠
Build, Own, Compound.
Renting AI is fast and easy. But owning your AI stack creates real leverage. Open models are strategic infrastructure. Build it, own it, compound it, and leave renters behind.
Reality Check ✅
Open-Source AI Will Eat the World
Let’s get this out of the way. I’m not an open-source AI expert.
I’ve been figuring this out in real time, mostly by watching founders and friends do smart things and asking a lot of dumb questions. This breakdown is as much for me as it is for you.
But here’s what’s already obvious: owning your AI stack isn’t some fringe hacker project anymore. It’s becoming the move for teams that actually want control.
Renting GPT-4 was the flex in 2024. But in 2025, the savvier teams are quietly building moats using open models like Llama and DeepSeek.
Open source AI is about cutting API bills, speeding up ops, and keeping your data in-house where it belongs.
In this breakdown:
✅ Where open models win and where they still fall short.
✅ How to get started without a PhD or burning your laptop.
✅ What founders wish they knew before starting.
Open source AI is like getting the recipe, not just the dish.
With open source models you own the code, the weights, the whole thing. You can poke it, twist it, retrain it, or rip it apart if you need to.
Let’s compare both approaches:
Open-source: It’s like cooking at home. You see exactly what's going in the dish and can tweak the recipe any way you want.
Closed models (OpenAI, Anthropic): It's like Uber Eats. You get a polished product delivered straight to your door. Easy, but no control.
Why should startups care?
Because open source is catching up fast.
Meta’s Llama family blew past 1B downloads in March 2025 and hit 1.2B by late April.
DeepSeek R1 claims GPT-4-level scores on roughly $6-7M of compute alone (hardware and data extras not included). GPT-4 reportedly cost OpenAI $100M. Ouch.
The biggest mistake I see is assuming open models are automatically worse and then getting blindsided when setup and fine-tuning aren’t plug-and-play. People expect ChatGPT-level polish out of the box. But open source isn’t worse. It’s raw. And if you treat it like a shortcut instead of infrastructure, you’ll burn time and get nowhere.
Three reasons startups are making the switch:
Slash costs. GPT-4.1: $2 in/$8 out per million tokens. Self-hosted Llama 3? Drops below a dollar per million. More runway, fewer headaches (rough maths after this list).
Control your voice. Fine-tune to match your industry slang and workflows, without begging for API permission.
Own your data. No more hidden retention clauses. Your model, your network, your rules. Auditors love it.
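If you want to sanity-check the "slash costs" claim for your own workload, here's a back-of-envelope sketch. Every number in it is an assumption: the API rates are the GPT-4.1 figures quoted above, and the GPU rate, throughput, and token volumes are placeholders to swap for your own.

```python
# Back-of-envelope cost comparison (illustrative only; all numbers are assumptions).
# API pricing uses the $2 in / $8 out per million tokens figure quoted above;
# the self-hosted side assumes one rented GPU and a rough throughput guess.

MONTHLY_INPUT_TOKENS = 2_000_000_000   # hypothetical workload
MONTHLY_OUTPUT_TOKENS = 500_000_000

# Rented API: pay per token
api_cost = (MONTHLY_INPUT_TOKENS / 1e6) * 2.00 + (MONTHLY_OUTPUT_TOKENS / 1e6) * 8.00

# Self-hosted: pay for GPU hours instead of tokens
GPU_HOURLY_RATE = 2.50        # assumed cloud price for one A100-class GPU
TOKENS_PER_SECOND = 1_000     # assumed aggregate throughput with batched serving
total_tokens = MONTHLY_INPUT_TOKENS + MONTHLY_OUTPUT_TOKENS
gpu_hours = total_tokens / TOKENS_PER_SECOND / 3600
self_hosted_cost = gpu_hours * GPU_HOURLY_RATE

print(f"API bill:         ${api_cost:,.0f}/month")
print(f"Self-hosted GPUs: ${self_hosted_cost:,.0f}/month ({gpu_hours:,.0f} GPU-hours)")
```

The point isn't the exact figures; it's that self-hosting turns an open-ended per-token bill into a fixed GPU bill you control.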
The old playbook is getting burned
Many teams are still locked into the “rent-everything” model:
Pay-per-token on every experiment
Limited by someone else’s product roadmap
Data goes to a third party, no questions asked
Relying on closed APIs is like building your product on rented land. It’s expensive, restricted, and one policy update away from breaking everything.
The teams who get this are using open source AI to:
Build custom automations that save on API bills
Tune models for industry slang, internal workflows, or even just local accents
Keep sensitive data inside their walls, not inside a black box in California
How startups are using open source AI
Sevilla FC’s Scouting Revolution
Sevilla FC, a Spanish football club, built a custom “Scout Advisor” on Meta’s Llama, as reported in Meta’s own technical update.
Before: Scouts wasted 200 hours grinding through player reports.
After: Done in 2 minutes, just as good, often better.
Sofya Health’s Clinical Note Assistant
US-based Sofya Health built an in-house note generator for doctors, training Llama on anonymised clinical notes and deploying it inside their own network.
Before: Doctors wasted 30% of consults buried in admin.
After: Instant notes, perfect compliance, zero leaks. All within their system.
But don’t be fooled. This stuff is hard.
Here’s where most teams mess up:
Plug-and-play is a myth: You'll need infrastructure, GPUs, and technical savvy.
Watch out for hallucinations: Deploy RAG systems, rigorous QA, and smart prompts (see the minimal sketch after this list).
Don’t skip security: Open weights mean potential risks; sandbox carefully.
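On the hallucination point, the usual fix is to ground the model in your own documents before it answers. Here's a minimal RAG sketch; the embedding model name and the final generation call are assumptions, so swap in whatever you actually run locally.

```python
# Minimal retrieval-augmented generation (RAG) sketch: retrieve relevant docs,
# then pin the model to them instead of letting it free-associate.

from sentence_transformers import SentenceTransformer, util

docs = [
    "Refunds are processed within 5 business days of approval.",
    "Enterprise plans include a dedicated support channel.",
    "Password resets are self-service via the account settings page.",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")   # small, CPU-friendly embedder
doc_vectors = embedder.encode(docs, convert_to_tensor=True)

def build_grounded_prompt(question: str, top_k: int = 2) -> str:
    """Retrieve the most relevant docs and constrain the answer to them."""
    q_vec = embedder.encode(question, convert_to_tensor=True)
    scores = util.cos_sim(q_vec, doc_vectors)[0]
    best = scores.topk(k=top_k).indices.tolist()
    context = "\n".join(docs[i] for i in best)
    return (
        "Answer using ONLY the context below. If the answer is not in the context, "
        "say you don't know.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

prompt = build_grounded_prompt("How long do refunds take?")
# answer = my_local_model.generate(prompt)   # hypothetical call to your self-hosted model
print(prompt)
```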
No, you don’t need a PhD. But yes, it takes work.
I’ve watched this play out with one founder I know who spun up open models on a Mac Studio just to test DeepSeek R1. It nearly melted the machine. But after a bit of tuning (and some clever model swaps), they had a self-hosted AI assistant running internal ops: no API fees, no data leaks, and full control.
How to make it work
Pick an annoying task. Automate it away (e.g., support tickets).
Choose your model. Llama 4 or DeepSeek R1.
Deploy fast. Hugging Face for simplicity, llama.cpp to dig deeper (see the sketch after this list).
Tune quickly. Just 50 samples can take you from generic to pro.
Feedback loop. Track improvements daily. Weights & Biases helps.
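To make the "deploy fast" step concrete, here's a minimal self-hosted sketch using the Hugging Face transformers pipeline for the support-ticket example from step 1. The model ID is an assumption; use whichever open checkpoint you have access to and enough GPU memory for.

```python
# Minimal self-hosted inference sketch with Hugging Face transformers.
# The model ID is an assumption; any open instruct-tuned checkpoint works.

from transformers import pipeline

MODEL_ID = "meta-llama/Llama-3.1-8B-Instruct"   # assumed; swap for your own model

generator = pipeline(
    "text-generation",
    model=MODEL_ID,
    device_map="auto",   # spread across available GPUs, fall back to CPU
)

ticket = "Customer says the export-to-CSV button does nothing on Firefox."
prompt = (
    "You triage support tickets. Reply with a one-line category "
    "(bug, billing, how-to) and a short suggested first response.\n\n"
    f"Ticket: {ticket}\nTriage:"
)

result = generator(prompt, max_new_tokens=120, do_sample=False)
print(result[0]["generated_text"])
```

From there, the "tune quickly" step is typically a small LoRA pass over ~50 of your own examples, and the feedback loop is just logging every output somewhere you'll actually review it.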
Arbitrage like it’s 2012 again
A few years from now, everyone will be running some version of this playbook. But right now, the cost to experiment is low and the upside is high.
Think of it as the AWS moment for cognition. The winners will be the ones with the sharpest systems and the most leverage.
Final thoughts
Open source AI cuts costs, speaks your language, and keeps your secrets safe. Why get held hostage by pricing tiers?
Paying premium prices to rent models is a badge of outdated thinking. This is a power shift. In five years, every serious startup will run on open models. The only question is whether you’ll lead or lag.
Sure, proprietary AI still leads slightly in raw performance. But that gap is shrinking fast. Open source offers the control, customisation, and cost-effectiveness startups crave.
Stop renting leverage. Build, own, compound.
Until next week, keep building, no fairytales required.
Martin, Chief Ranter at Uncharted