MCP Is the USB Port of AI — But Is It Already Dead?

In November 2024, Anthropic quietly released the Model Context Protocol, a standard way for AI models to connect to external tools and data sources. Think of it as USB for AI: one protocol, any tool, any model.

Fourteen months later, MCP has 97 million monthly SDK downloads. Over 5,000 registered servers. OpenAI adopted it. Google adopted it. Microsoft put it in Copilot Studio. In December 2025, Anthropic donated MCP to the Linux Foundation’s Agentic AI Foundation, with OpenAI and Block as co-founders and AWS, Google, Microsoft, and Cloudflare as supporting members.

By every metric, MCP won.

So why are developers saying it’s dead?

The efficiency problem

The case against MCP comes down to one thing: tokens.

MCP tools are expensive to use. Every MCP server you connect dumps its tool definitions into the model’s context window. Warp’s engineering team found that 90% of tasks that have MCP context available don’t actually need it. The tools are just sitting there, eating tokens, doing nothing.
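
To make that overhead concrete, here is roughly what a single tool definition looks like once it lands in context. The field names (name, description, inputSchema) come from the protocol's tools/list response; the browser tool itself is an invented example.

```typescript
// Rough shape of one MCP tool definition, as returned by a server's
// tools/list call. The fields are the spec's; this particular tool is
// an invented example.
const navigateTool = {
  name: "browser_navigate",
  description:
    "Navigate the active browser tab to a URL. Waits for the page load " +
    "event before returning. Fails if the URL is unreachable or the tab " +
    "has been closed.",
  inputSchema: {
    type: "object",
    properties: {
      url: { type: "string", description: "Absolute URL to open" },
      timeoutMs: {
        type: "number",
        description: "How long to wait for the load event, in milliseconds",
      },
    },
    required: ["url"],
  },
};

// A browser-automation server exposes dozens of tools like this (click,
// type, screenshot, evaluate, ...), and every definition is serialized into
// the model's context on every request, whether or not the task needs it.
```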

The numbers are stark. Vercel’s agent-browser team measured a 70% token reduction when they switched from Chrome DevTools MCP to a CLI-based approach. MCP tool definitions alone consumed 4,000 tokens. The CLI equivalent? 95 tokens. For a single browser automation task, MCP consumed 38,000 tokens. The CLI approach: 12,000.

Jason Zhou, who built Vercel’s agent-browser, put it bluntly: MCP is becoming a dead concept. Developers are increasingly just building CLIs instead.

And he’s not alone. Ben Tossell, founder of Ben’s Bites, says he prefers CLI versions of tools — Supabase, Vercel, GitHub, Linear — over their MCP integrations. Less token overhead. More control. Only the tools you need, not the entire suite.

The other side

But here’s what the “MCP is dead” crowd is missing: they’re mostly developers building for developers.

MCP wasn’t designed for people who are comfortable writing CLI commands. It was designed to be a universal standard that any application can implement. The USB analogy is apt — USB didn’t win because it was the most efficient protocol. It won because it was the most universal.

The efficiency arguments are real, but they're being solved. Warp built a search subagent architecture that loads relevant MCP tools only on demand, achieving a 46.9% token reduction when multiple servers are connected. Instead of stuffing everything into the context window, a smaller model searches the available tools and returns only what's needed. The main agent never sees the rest.
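
The idea is simple to sketch: score every available tool definition against the task, and hand the main agent only the top matches. The snippet below is an illustration of that pattern, with naive keyword matching standing in for whatever search Warp actually runs; it is not their implementation.

```typescript
// Simplified sketch of on-demand tool loading. A cheap relevance pass picks
// the few tool definitions that matter for this task; only those are
// injected into the main agent's context.

interface ToolDef {
  name: string;
  description: string;
  inputSchema: unknown;
}

// Placeholder relevance score: keyword overlap between the task and the
// tool's name/description. A real system might use embeddings or a small
// model instead.
function relevance(task: string, tool: ToolDef): number {
  const words = task.toLowerCase().split(/\W+/);
  const haystack = `${tool.name} ${tool.description}`.toLowerCase();
  return words.filter((w) => w.length > 3 && haystack.includes(w)).length;
}

function selectTools(task: string, allTools: ToolDef[], limit = 5): ToolDef[] {
  return allTools
    .map((tool) => ({ tool, score: relevance(task, tool) }))
    .filter(({ score }) => score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, limit)
    .map(({ tool }) => tool);
}

// The main agent's prompt contains selectTools(task, allTools), not the
// full catalog from every connected server.
```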

And the ecosystem numbers are hard to argue with. 5,000+ servers. Every major cloud provider shipping managed MCP endpoints. Google just added gRPC transport support in February 2026. January saw the launch of MCP Apps — UI capabilities for dashboards, forms, and multi-step workflows. The protocol is expanding, not contracting.

The real question

The debate isn’t actually “MCP vs CLI.” It’s about where each approach makes sense.

CLI wins when you're a developer working in a terminal, debugging code, and you know exactly which tools you need. The token savings are real and the workflow is natural.

MCP wins when you need a universal interface. When you’re building a product that connects to arbitrary services. When your users aren’t developers. When you want to plug in a Figma integration, a database connector, and a calendar tool without writing custom code for each one.
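
In practice, that plugging in is a configuration change rather than code. Several MCP clients (Claude Desktop is the familiar example) wire up servers with an mcpServers block along these lines; the package names below are placeholders, not specific real servers.

```json
{
  "mcpServers": {
    "figma": {
      "command": "npx",
      "args": ["-y", "example-figma-mcp-server"]
    },
    "database": {
      "command": "npx",
      "args": ["-y", "example-postgres-mcp-server", "postgresql://localhost/app"]
    },
    "calendar": {
      "command": "npx",
      "args": ["-y", "example-calendar-mcp-server"]
    }
  }
}
```

Each entry is just a command the client launches and talks to over the protocol. Swapping a connector means editing one entry, not writing and maintaining another wrapper.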

The comparison that matters isn't tokens per task. It's total cost of integration. Building and maintaining custom CLI wrappers for every service you use is cheap when you have five tools. It's expensive when you have fifty.

What this means if you’re building

If you’re building AI-powered software today, the practical answer is both:

  • Use CLIs for your core developer workflow. The token savings compound.
  • Support MCP as the integration layer for everything else. It’s where the ecosystem is, and fighting a standard backed by Anthropic, OpenAI, Google, Microsoft, and the Linux Foundation isn’t a winning strategy.

The “MCP is dead” take is the kind of thing that sounds smart on Twitter and ages poorly in production. USB wasn’t the most efficient serial protocol either. It just won.

MCP has problems — token overhead, security gaps, the cold-start cost of loading tool definitions. These are engineering problems. They’re being solved. The protocol itself is becoming infrastructure.

The question isn’t whether MCP survives. It’s whether your product plugs into the ecosystem or gets left out of it.