OAuth for delegated agents, scoped tokens for B2B, anonymous for read-only. Rate limiting, logging, and the auth decision tree for MCP servers.


Last verified: May 1, 2026

#MCP authentication patterns: OAuth, tokens, and when to use each

The Model Context Protocol specification (modelcontextprotocol.io) defines transports and primitives. It does not define authentication. That is correct, because authentication is a transport-and-deployment concern, not a protocol concern. It is also the source of the most common production mistake I see: an MCP server shipped to the internet with no auth, exposing tools that mutate state.


#TL;DR

  • Read-only over public data with rate limits is the only legitimate anonymous case.
  • Scoped API tokens with hashed storage and short TTL cover B2B and headless flows.
  • OAuth 2.1 with PKCE covers consumer-facing assistants acting on behalf of a logged-in user.
  • Rate limiting attaches to the principal; mutating tools get a tighter bucket.
  • Every auth event is logged: issued, used, revoked, failed.

#The decision tree

I run the same five-question check before picking an auth pattern:

  1. Does any tool mutate state? If yes, anonymous is off the table.
  2. Does the agent act on behalf of a specific human end user? If yes, OAuth.
  3. Does the agent act as a B2B integration with no end user attached? If yes, scoped API token.
  4. Is the data surface reachable on the public internet? If yes, rate limiting is mandatory.
  5. Are mutating and read-only tools mixed in the same server? If yes, scope the auth per tool, not per server.

The decision tree maps to three patterns:

| Pattern | When to use | Implementation |
|---|---|---|
| Anonymous + IP rate limit | Public catalogue, read-only, bounded load | Worker checks IP bucket only |
| Scoped API token | B2B integration, headless agent runtime, no human end user | JWT with claims, hashed storage, rotation |
| OAuth 2.1 + PKCE | Consumer agent acting for a logged-in user | Standard authorization code flow |

The patterns are not mutually exclusive. A real production server often runs all three, with each tool tagged with its required scope.
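The five-question check can be encoded as a small design-time helper. This is a sketch with illustrative names, not part of any SDK:

```typescript
type AuthPattern = "anonymous" | "api_token" | "oauth";

// Hypothetical profile answering the decision-tree questions for one tool.
interface ToolProfile {
  mutatesState: boolean;     // question 1
  actsForEndUser: boolean;   // question 2
  isB2BIntegration: boolean; // question 3
}

// Mirrors the decision tree: end-user delegation wins, then B2B,
// then "mutation means no anonymous", then the read-only default.
function pickAuthPattern(tool: ToolProfile): AuthPattern {
  if (tool.actsForEndUser) return "oauth";
  if (tool.isB2BIntegration) return "api_token";
  if (tool.mutatesState) return "api_token"; // anonymous is off the table
  return "anonymous";
}
```

Questions 4 and 5 are deliberately absent: rate limiting and per-tool scoping apply on top of whichever pattern the function picks.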

#Pattern one: anonymous read-only with rate limiting

A catalogue browse tool that wraps /wp-json/wc/v3/products?stock_status=instock exposes the same data the public website already serves. Adding auth in front of it does not improve the security posture; it just reduces accessibility. The legitimate threat is load: an agent loop or a competitor scrape can hammer the WooCommerce origin.

The implementation:

async function checkAnonymousRateLimit(request: Request, env: Env): Promise<boolean> {
  const ip = request.headers.get("CF-Connecting-IP") ?? "unknown";
  const key = `rl:anon:${ip}`;
  const count = Number((await env.RATE_LIMIT.get(key)) ?? "0");
  if (count >= 60) return false;
  await env.RATE_LIMIT.put(key, String(count + 1), { expirationTtl: 60 });
  return true;
}

60 requests per minute per IP is the default I ship. The KV-based counter is approximate (eventual consistency on KV writes); for tighter limits, Durable Objects or Cloudflare’s native Rate Limiting binding is the right tool.

What this pattern does not cover: any tool that returns user-specific data, any tool that mutates state, any tool that reveals stock for non-public products. Those need a real principal.

#Pattern two: scoped API tokens for B2B

The B2B case: a partner integration ships an agent that calls your MCP server on behalf of the partner organisation, not on behalf of a specific human end user. Examples include an inventory-sync agent at a wholesaler, a marketplace integration at an aggregator, and an analytics agent that reports order trends to a BI tool.

The flow:

  1. An admin in the WordPress backend creates a token with a name, a set of scopes (catalogue:read, inventory:read, orders:write), and an expiry (default 90 days).
  2. The token is shown once to the admin, then stored as a SHA-256 hash plus the scope set plus the expiry.
  3. The partner configures their MCP client with the token in an Authorization: Bearer <token> header.
  4. The Worker hashes the incoming token and looks up the hash in KV. If the hash matches, the expiry is in the future, and the requested tool’s required scope is in the token’s scope set, the call proceeds.

// Hex SHA-256 digest via the Web Crypto API available in Workers.
async function sha256(input: string): Promise<string> {
  const digest = await crypto.subtle.digest("SHA-256", new TextEncoder().encode(input));
  return [...new Uint8Array(digest)].map((b) => b.toString(16).padStart(2, "0")).join("");
}

async function verifyApiToken(authHeader: string | null, env: Env): Promise<TokenContext | null> {
  if (!authHeader?.startsWith("Bearer ")) return null;
  const token = authHeader.slice(7);
  const hash = await sha256(token);
  const record = await env.TOKENS.get(`tok:${hash}`, "json") as TokenRecord | null;
  if (!record) return null;
  if (record.expiresAt < Date.now()) return null; // expired tokens fail closed
  return { tokenId: record.id, scopes: record.scopes, principal: record.principal };
}

function requireScope(ctx: TokenContext, scope: string): void {
  if (!ctx.scopes.includes(scope)) {
    throw new McpError("forbidden", `Tool requires scope ${scope}`);
  }
}

Two operational habits keep this pattern honest:

Rotation, not eternity. A token that never expires is a token that ends up in a public GitHub repo six months from now. 90-day default with a renewal flow that overlaps the old and new token by 7 days is the pattern that survives real partners.
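The 7-day overlap can be expressed as one validity check. A minimal sketch, assuming a `supersededAt` timestamp is written to the token record when the replacement is issued (the field name is illustrative):

```typescript
const OVERLAP_MS = 7 * 24 * 60 * 60 * 1000; // 7-day grace window

interface RotatableToken {
  expiresAt: number;     // original expiry, ms since epoch
  supersededAt?: number; // set when the replacement token is issued
}

// A superseded token stays valid for the overlap window, so the partner
// can roll the new credential out without a hard cutover.
function isTokenUsable(token: RotatableToken, now: number): boolean {
  if (token.supersededAt !== undefined) {
    return now < Math.min(token.expiresAt, token.supersededAt + OVERLAP_MS);
  }
  return now < token.expiresAt;
}
```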

Hashed storage, not plaintext. If your KV store is exfiltrated, the hashes are useless without the original token. If you stored plaintext tokens, every partner integration needs immediate rotation. The cost difference at issue time is one sha256 call.

#Pattern three: OAuth 2.1 with PKCE for delegated agents

The consumer case: a user opens Claude Desktop, connects their account on your store via OAuth, and asks the agent to “show my last order.” The agent now needs to call your MCP server with credentials that say “this is acting for user 4231.”

OAuth 2.1 (draft-ietf-oauth-v2-1) is the consolidated profile. PKCE (RFC 7636) is mandatory for public clients (which includes desktop assistants without a confidential server-side component).

The flow, in five legs:

  1. The MCP host launches a browser to your authorization endpoint with response_type=code, code_challenge=<S256 hash of verifier>, and the requested scopes.
  2. The user logs into your WordPress site (or your authentication provider) and approves the requested scopes.
  3. Your authorization endpoint redirects back to the host with a one-time authorization code.
  4. The host exchanges the code plus the original code_verifier at your token endpoint for an access token (short TTL, 1 hour) and a refresh token (longer TTL, 30 days).
  5. The host calls the MCP server with Authorization: Bearer <access_token>. The Worker verifies the token’s signature, expiry, and scope set against the called tool.

The access token is a signed JWT. The Worker verifies it with the Web Crypto API (Cloudflare Workers reference) without an external library:

// base64url decode helper for the JWT segments.
function base64UrlDecode(s: string): Uint8Array {
  const b64 = s.replace(/-/g, "+").replace(/_/g, "/");
  return Uint8Array.from(atob(b64), (c) => c.charCodeAt(0));
}

async function verifyJwt(token: string, env: Env): Promise<JwtClaims | null> {
  const [headerB64, payloadB64, sigB64] = token.split(".");
  if (!headerB64 || !payloadB64 || !sigB64) return null;
  const data = new TextEncoder().encode(`${headerB64}.${payloadB64}`);
  const sig = base64UrlDecode(sigB64);
  // Web Crypto names the RS256 algorithm "RSASSA-PKCS1-v1_5"; env.PUBLIC_KEY is a
  // CryptoKey imported at startup with { name: "RSASSA-PKCS1-v1_5", hash: "SHA-256" }.
  const valid = await crypto.subtle.verify("RSASSA-PKCS1-v1_5", env.PUBLIC_KEY, sig, data);
  if (!valid) return null;
  const payload = JSON.parse(new TextDecoder().decode(base64UrlDecode(payloadB64)));
  if (payload.exp * 1000 < Date.now()) return null; // reject expired tokens
  return payload as JwtClaims;
}

The scopes carried in the JWT mirror the scopes from the scoped-token pattern: catalogue:read, orders:read, orders:write. The user’s WordPress user ID rides in the sub claim, so an orders:read call returns only that user’s orders.

#Mixing the patterns in one server

A real WooCommerce MCP server typically exposes:

  • catalogue.list and product.detail: anonymous + IP rate limit.
  • inventory.check (for partners): scoped token with inventory:read.
  • order.status (for the logged-in user): OAuth with orders:read.
  • order.intent (for the logged-in user): OAuth with orders:write, plus a tighter rate limit.

The Worker pre-dispatch logic walks through the patterns:

async function authenticate(request: Request, toolName: string, env: Env): Promise<Principal> {
  const requirement = TOOL_AUTH_REQUIREMENTS[toolName];
  if (requirement === "anonymous") {
    if (!await checkAnonymousRateLimit(request, env)) throw new McpError("rate_limit");
    return { kind: "anonymous" };
  }
  const auth = request.headers.get("Authorization");
  if (requirement === "api_token") {
    const ctx = await verifyApiToken(auth, env);
    if (!ctx) throw new McpError("unauthorized");
    return { kind: "api_token", ctx };
  }
  if (requirement === "oauth") {
    const claims = auth?.startsWith("Bearer ") ? await verifyJwt(auth.slice(7), env) : null;
    if (!claims) throw new McpError("unauthorized");
    return { kind: "oauth", claims };
  }
  throw new Error(`Unknown auth requirement for ${toolName}`);
}

The TOOL_AUTH_REQUIREMENTS map is the single source of truth for which tool needs which auth mode. No tool should be added without an explicit entry.
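The shape of that map, using the tool names from this article (the registration guard is a suggested addition, not something the dispatch code above requires):

```typescript
type AuthMode = "anonymous" | "api_token" | "oauth";

// Single source of truth: every tool gets an explicit entry.
const TOOL_AUTH_REQUIREMENTS: Record<string, AuthMode> = {
  "catalogue.list":  "anonymous",
  "product.detail":  "anonymous",
  "inventory.check": "api_token",
  "order.status":    "oauth",
  "order.intent":    "oauth",
};

// A guard at tool-registration time catches the "added without an entry" mistake early.
function assertToolRegistered(toolName: string): AuthMode {
  const mode = TOOL_AUTH_REQUIREMENTS[toolName];
  if (mode === undefined) throw new Error(`No auth requirement declared for ${toolName}`);
  return mode;
}
```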

#Rate limiting per principal

Anonymous traffic gets a per-IP bucket. Token-authenticated traffic gets a per-token bucket. OAuth traffic gets a per-user bucket. Mutating tools get a tighter ceiling than read tools regardless of principal.

For the WooCommerce shape, the buckets I default to:

| Tool category | Anonymous | Token | OAuth |
|---|---|---|---|
| catalogue.* (read) | 60 / minute / IP | 600 / minute / token | 120 / minute / user |
| inventory.* (read) | not allowed | 300 / minute / token | not allowed |
| order.status (read) | not allowed | 60 / minute / token | 60 / minute / user |
| order.intent (write) | not allowed | 30 / minute / token | 10 / minute / user |

The numbers are starting points; the right values come from observing real traffic for two weeks and tuning. The structure is what matters.
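The bucket table reduces to one lookup keyed by tool category and principal kind. A sketch with the numbers from the table (the structure is illustrative):

```typescript
type PrincipalKind = "anonymous" | "token" | "oauth";

// Requests-per-minute ceilings; null means "not allowed for this principal kind".
const BUCKETS: Record<string, Record<PrincipalKind, number | null>> = {
  "catalogue":    { anonymous: 60,   token: 600, oauth: 120 },
  "inventory":    { anonymous: null, token: 300, oauth: null },
  "order.status": { anonymous: null, token: 60,  oauth: 60 },
  "order.intent": { anonymous: null, token: 30,  oauth: 10 },
};

function bucketFor(category: string, kind: PrincipalKind): number {
  const limit = BUCKETS[category]?.[kind];
  if (limit == null) throw new Error(`${kind} principals may not call ${category}`);
  return limit;
}
```

Encoding "not allowed" as a thrown error rather than a zero limit keeps the authorisation decision distinct from the throttling decision in logs.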

#Logging the auth events

Every auth-relevant event lands in the log:

  • Token issued. Admin user, target principal, scopes, expiry.
  • Token used. Token ID, tool name, principal, latency, success/failure.
  • Token revoked. Token ID, who revoked, why.
  • Token failed. Reason (expired, missing scope, hash mismatch), IP, user agent.
  • OAuth code exchanged. User ID, scopes granted, refresh token issued.
  • OAuth refresh. User ID, new access token issued, old token superseded.

The logs go to Cloudflare Logpush into a long-term store. A dashboard query that watches “tokens used in the last 24 hours that have not been used in the previous 90 days” catches likely token theft. A query that watches “failed token verifications by IP” catches credential-stuffing attempts.
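The event list above can be modelled as a discriminated union so every Logpush payload stays queryable with the same `kind` field. Field names are assumptions matching the bullets, not a fixed schema:

```typescript
// One variant per auth event from the list above.
type AuthEvent =
  | { kind: "token_issued";   adminUser: string; principal: string; scopes: string[]; expiresAt: number }
  | { kind: "token_used";     tokenId: string; tool: string; principal: string; latencyMs: number; ok: boolean }
  | { kind: "token_revoked";  tokenId: string; revokedBy: string; reason: string }
  | { kind: "token_failed";   reason: "expired" | "missing_scope" | "hash_mismatch"; ip: string; userAgent: string }
  | { kind: "oauth_exchange"; userId: number; scopes: string[] }
  | { kind: "oauth_refresh";  userId: number };

// Stamp every event with a timestamp and serialize it as one JSON log line.
function serializeAuthEvent(event: AuthEvent): string {
  return JSON.stringify({ ts: Date.now(), ...event });
}
```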

#Where this fits in the cluster

This article covers the auth surface. For the implementation walkthrough see building an MCP server for WooCommerce. For typed tool definitions see writing typed catalogue tools with Zod for MCP. For the protocol-level decision see MCP vs REST. For the migration pathway from an existing API see migrating an existing WordPress API to MCP. The pillar is MCP server development.

Pricing is individual because the auth scope depends on which patterns your environment requires; an anonymous-only read server is a different engagement than a full OAuth-issuing surface.

Next step

Turn the article into an actual implementation


Frequently Asked Questions
Does the MCP spec require authentication?
No. The Model Context Protocol specification leaves authentication to the transport layer. Stdio transports are usually trusted by virtue of running locally. HTTP transports facing the internet require an authentication layer that the SDK does not provide out of the box.
When is an anonymous MCP server acceptable?
When every exposed tool is read-only over public data and the load impact is bounded by IP-based rate limits. A catalogue browse tool that wraps /wp-json/wc/v3/products?stock_status=instock is acceptable. An order lookup tool is not.
Why OAuth 2.1 specifically?
OAuth 2.1 consolidates the modern guidance from OAuth 2.0 plus PKCE, drops the deprecated implicit and resource-owner-password flows, and matches what Claude Desktop and other MCP hosts support natively for delegated agent access.
Where do scoped tokens live?
Tokens are issued by an admin surface on the WordPress side, persisted with their hashed value plus scopes plus expiry, and verified on every MCP request. Storing them in plaintext is the same mistake as storing passwords in plaintext.
How does rate limiting interact with auth?
Rate limits attach to the principal: an authenticated token has its own bucket, anonymous traffic shares a per-IP bucket. Mutating tools get a tighter bucket regardless of principal, so a compromised token cannot hammer order.intent in a tight loop.


Related Articles

Building an MCP server for WooCommerce: a practitioner's guide
A practical walkthrough of building a Model Context Protocol server in front of WooCommerce. Tool definitions, catalogue and order endpoints, schema.org alignment, Zod validation, and a Cloudflare Workers deployment that an AI agent can talk to.

MCP vs REST: when each wins for AI agent integration
A decision guide for picking between Model Context Protocol and a REST API when the consumer is an AI agent. Typed surface vs JSON shape inference, mutating actions, authentication, and the hybrid pattern that often beats both.

Migrating an existing WordPress API to MCP: a 4-week playbook
A four-week migration playbook for putting a Model Context Protocol server in front of an existing WordPress REST API. Endpoint audit, MCP scaffold, parallel-run, cutover, and the observability that makes the move safe.