A take as of March 2026 - models are getting good enough to call most APIs directly, which moves the line on what an MCP is actually for.

The Problem

MCP (Model Context Protocol) servers were supposed to be the standard way AI tools connect to external services. In practice, most are thin wrappers around REST APIs that add maintenance overhead without proportional value.

Every MCP server is another stdio process that can crash, another npm package that can break, another credential pipe that needs plumbing. When Claude can already call APIs directly with curl, why maintain the abstraction?

Where MCPs Still Earn Their Keep

Not all MCPs are equal. The ones worth keeping share a pattern: they handle complexity you genuinely don’t want to replicate every session.

Complex auth flows - Google Workspace needs OAuth2 token refresh across multiple scopes and services. Doing this raw with curl every session would be painful. The MCP handles the token lifecycle invisibly.
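
The token lifecycle is the part worth keeping out of the session. A minimal sketch of what the MCP is doing for you - `refresh_fn` is a hypothetical callable standing in for the real refresh-token exchange, and the 60-second skew is an arbitrary choice, not anything Google specifies:

```python
import time

class TokenCache:
    """Caches an OAuth2 access token and refreshes it shortly before expiry.

    `refresh_fn` is a hypothetical callable that performs the refresh-token
    exchange and returns (access_token, expires_in_seconds).
    """

    def __init__(self, refresh_fn, skew=60):
        self.refresh_fn = refresh_fn
        self.skew = skew            # refresh this many seconds early
        self.token = None
        self.expires_at = 0.0

    def get(self):
        # Refresh only when missing or about to expire; otherwise reuse.
        if self.token is None or time.time() >= self.expires_at - self.skew:
            self.token, expires_in = self.refresh_fn()
            self.expires_at = time.time() + expires_in
        return self.token
```

Doing this "raw" means re-running the refresh dance by hand every session; the cache makes repeated calls free until the token actually ages out.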

High-surface-area APIs - Cloudflare has 80+ operations across Workers, D1, KV, R2, Queues, and Durable Objects. The MCP provides structured discovery. Could you figure it out from docs? Yes. But the MCP makes it instant.
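
"Structured discovery" is really just a queryable catalog. A toy version - the operation names and endpoint paths below are made up for illustration, not Cloudflare's real API surface:

```python
# Illustrative operation catalog. The value of the MCP here is the lookup
# step: the model asks "what can I do with KV?" instead of re-deriving the
# answer from 80+ pages of docs every session.
OPERATIONS = {
    "kv.get":         ("GET",  "/accounts/{account}/kv/namespaces/{ns}/values/{key}"),
    "kv.put":         ("PUT",  "/accounts/{account}/kv/namespaces/{ns}/values/{key}"),
    "d1.query":       ("POST", "/accounts/{account}/d1/database/{db}/query"),
    "workers.deploy": ("PUT",  "/accounts/{account}/workers/scripts/{name}"),
}

def discover(prefix):
    """Return the operation names under a given prefix, sorted."""
    return sorted(name for name in OPERATIONS if name.startswith(prefix))
```

With a handful of operations this is overkill; with 80+ it is the difference between instant and a docs-reading detour.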

Stateful protocols - anything needing persistent connections or session state is harder to replicate with one-shot curl calls.
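
The state problem in miniature - a toy cursor-based session, not any real protocol. Each call depends on state from the previous one; one-shot curl calls would have to re-thread that cursor by hand every time, while a long-lived process just keeps it:

```python
class PagedSession:
    """Toy stateful session over a cursor-based API.

    `pages` stands in for a remote service; the cursor is the session
    state that a one-shot curl call cannot hold between invocations.
    """

    def __init__(self, pages):
        self.pages = pages
        self.cursor = 0

    def next_page(self):
        # Returns the next page, or None when exhausted.
        if self.cursor >= len(self.pages):
            return None
        page = self.pages[self.cursor]
        self.cursor += 1    # the state that must survive between calls
        return page
```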

Where MCPs Are Already Dead Weight

Simple API wrappers - if the MCP just wraps a REST endpoint with an API key header, skip it. A curl call does the same thing with zero maintenance. We run Calendly and Kit this way - direct API calls, no MCP, works fine.
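
For this class of service, the entire "integration" is one authenticated request. A sketch with Python's standard library - the URL and Bearer scheme are illustrative, so check the service's docs for the real ones:

```python
import urllib.request

def direct_call(url, token):
    """Build a plain authenticated GET - everything a thin-wrapper MCP
    was doing. URL and auth scheme are illustrative placeholders."""
    return urllib.request.Request(url, headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    })
    # Caller would pass the request to urllib.request.urlopen().
```

Equivalently, `curl -H "Authorization: Bearer $TOKEN" $URL`. Either way there is no process to keep running, no package to update.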

Search/research tools - Perplexity is the clearest case. A model that can read docs and call APIs doesn’t need a search MCP. It can figure out the right endpoint, craft the request, and parse the response.

Low-frequency integrations - if you use a service twice a month, the MCP overhead (keeping it running, updating packages, managing credentials) isn’t justified. Just call the API when you need it.

The Transition Pattern

MCPs are following a familiar technology lifecycle:

  1. Early stage (2024-2025): models can’t reliably call APIs on their own. MCPs provide guardrails and structure.
  2. Current stage (2025-2026): models are good enough to call most APIs directly. MCPs add value only for complex auth, high-surface-area APIs, and stateful protocols.
  3. End state: models read API docs on the fly, manage auth flows autonomously, and remember API patterns across sessions. MCPs become redundant.
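
The stage-2 audit rule can be written down. A toy heuristic - the thresholds are arbitrary choices for illustration, not a standard:

```python
def mcp_verdict(complex_auth, high_surface, stateful, uses_per_month):
    """Toy stage-2 audit rule: keep an MCP only when it handles
    complexity worth amortizing. Thresholds are arbitrary."""
    if complex_auth or stateful:
        return "keep"                       # token lifecycle / session state
    if high_surface and uses_per_month >= 10:
        return "keep"                       # discovery pays off at volume
    return "replace"                        # direct API calls instead
```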

We’re solidly in stage 2. The practical move is to audit which MCPs genuinely earn their overhead and replace the rest with direct API calls.

Our Workspace Audit

| MCP | Verdict | Why |
| --- | --- | --- |
| Google Workspace | keep | OAuth2 complexity, multi-service auth, token refresh |
| Cloudflare | keep (for now) | 80+ operations, structured discovery helps. Could go direct for the 5 we actually use |
| Lemlist | replace | Thin API wrapper, low frequency |
| Perplexity | already removed | No account, and direct search works fine |
| HubSpot | evaluate | OAuth complexity says keep, usage frequency says maybe not |

The Broader Lesson

The AI tool ecosystem is going through a phase where every integration gets its own protocol server. It feels like the early days of REST when everything needed a dedicated client library before developers realized they could just call the endpoints directly.

MCPs will stick around for genuinely complex integrations (OAuth dance, WebSocket state, multi-step protocols). But for the 80% of integrations that are “add an auth header and call a JSON endpoint” - they’re already unnecessary. The model is smart enough.

Don’t build infrastructure for problems the model has already outgrown.