Picture this: You’re sitting in a conference room, halfway through a vendor pitch. The demo looks solid, and pricing fits nicely under budget. The timeline seems reasonable too. Everyone’s nodding along. You’re literally minutes away from saying yes.

Then someone from your finance team walks in. They see the deck and frown. A few minutes later, they shoot you a message on Slack: “Actually, I threw together a version of this last week. Took me 2 hours in Cursor. Wanna take a look?”

Wait… what?

This person doesn’t code. You know for a fact they’ve never written a line of JavaScript in their entire life. But here they are, showing you a working prototype on their laptop that does… pretty much exactly what the vendor pitched. Sure, it’s got some rough edges, but it works. And it didn’t cost six figures. Just two hours of their time.

Suddenly, the assumptions you walked in with — about how software is developed, who makes it and how decisions are made around it — all start coming apart at the seams.

The old framework

For decades, every growing company asked the same question: Should we build this ourselves, or should we buy it?

And, for decades, the answer was pretty straightforward: Build if it’s core to your business; buy if it isn’t.

The logic made sense, because building was expensive and meant borrowing time from overworked engineers, writing specs, planning sprints, managing infrastructure and bracing yourself for a long tail of maintenance. Buying was faster. Safer. You paid for the support and the peace of mind.

But something fundamental has changed: AI has made building accessible to everyone. What used to take weeks now takes hours, and what used to require fluency in a programming language now requires fluency in plain English.

When the cost and complexity of building collapse this dramatically, the old fr …