Anthropic’s open source standard, the Model Context Protocol (MCP), released in late 2024, allows users to connect AI models and the agents atop them to external tools in a structured, reliable format. It is the engine behind Anthropic’s hit AI agentic programming harness, Claude Code, giving it access to numerous functions, such as web browsing and file creation, the moment they are asked for.

But there was one problem: Claude Code typically had to “read” the instruction manual for every single tool available, regardless of whether it was needed for the immediate task, using up the available context that could otherwise be filled with more information from the user’s prompts or the agent’s responses.

At least until last night. The Claude Code team released an update that fundamentally alters this equation. Dubbed MCP Tool Search, the feature introduces “lazy loading” for AI tools, allowing agents to dynamically fetch tool definitions only when necessary. It is a shift that moves AI agents from a brute-force architecture to something resembling modern software engineering—and according to early data, it effectively solves the “bloat” problem that was threatening to stifle the ecosystem.

The ‘Startup Tax’ on Agents

To understand the significance of Tool Search, one must understand the friction of the previous system. The Model Context Protocol, released by Anthropic in 2024 as an open source standard, was designed to be a universal way to connect AI models to data sources and tools—everything from GitHub repositories to local file systems.

However, as the ecosystem grew, so did the “startup tax.”

Thariq Shihipar, a member of the technical staff at Anthropic, highlighted the scale of the problem in the announcement.

“We’ve found that MCP servers may have up to 50+ tools,” Shihipar wrote. “Users were documenting setups with 7+ servers consuming 67k+ tokens.”

In practical terms, this meant a developer using a robust set of tools might sacrifice 33% or more of their …
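For readers who want a concrete picture of what “lazy loading” of tool definitions means in practice, the sketch below illustrates the general pattern in plain Python. It is not Anthropic’s implementation; the registry, tool names, and helper functions (search_tools, load_tool) are hypothetical. The point is simply that the agent fetches a tool’s full definition only after a lightweight search identifies it as relevant, instead of placing every definition in the context window up front.

```python
# Illustrative sketch (not Anthropic's implementation) of eager vs. lazy
# loading of tool definitions. All tools, names, and schemas are hypothetical.

from dataclasses import dataclass

@dataclass
class ToolDefinition:
    name: str
    description: str
    schema: dict  # full JSON schema for the tool's parameters

# Hypothetical registry of tools exposed by connected servers.
REGISTRY = {
    "github_create_issue": ToolDefinition(
        name="github_create_issue",
        description="Create an issue in a GitHub repository",
        schema={"type": "object", "properties": {
            "repo": {"type": "string"}, "title": {"type": "string"}}},
    ),
    "fs_read_file": ToolDefinition(
        name="fs_read_file",
        description="Read a file from the local file system",
        schema={"type": "object", "properties": {"path": {"type": "string"}}},
    ),
    # ...dozens more in a realistic setup
}

def eager_context() -> list[dict]:
    """Old approach: every full definition goes into the prompt up front."""
    return [vars(t) for t in REGISTRY.values()]

def search_tools(query: str) -> list[str]:
    """Lazy approach, step 1: return only the names of matching tools."""
    q = query.lower()
    return [name for name, t in REGISTRY.items()
            if q in name or q in t.description.lower()]

def load_tool(name: str) -> dict:
    """Lazy approach, step 2: fetch one full definition when it is needed."""
    return vars(REGISTRY[name])

# The agent pays the token cost only for tools relevant to the task at hand.
print(search_tools("github"))            # ['github_create_issue']
print(load_tool("github_create_issue"))  # full schema, loaded on demand
```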