The Model Context Protocol (MCP) is how an AI agent — Cursor, Claude Desktop, Codex, a custom agent — calls a set of tools you publish. Think of it as the contract between “the agent’s reasoning loop” and “the things the agent can actually do.”
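On the wire, that contract is JSON-RPC 2.0: the agent discovers your tools with tools/list, then invokes them with tools/call. A round trip looks roughly like this (the tool name and arguments are illustrative, not part of any real server):

```json
// agent → server
{"jsonrpc": "2.0", "id": 1, "method": "tools/call",
 "params": {"name": "get_weather", "arguments": {"city": "Berlin"}}}

// server → agent
{"jsonrpc": "2.0", "id": 1,
 "result": {"content": [{"type": "text", "text": "12°C, overcast"}]}}
```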
There are two ways to ship an MCP server. Most tutorials show you the wrong one first.
Stdio MCP (local)
Your agent spawns a child process and reads/writes newline-delimited JSON-RPC messages over its stdin/stdout. The classic Anthropic example. Easy to wire up locally — node ./server.js and you’re done.
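To make the transport concrete, here is a minimal sketch of that stdio loop, handwritten rather than via the official @modelcontextprotocol/sdk. The server name, version string, and the "echo" tool are illustrative assumptions:

```typescript
// Sketch of a stdio MCP-style server: newline-delimited JSON-RPC on
// stdin/stdout. Handwritten for clarity; real servers would use the SDK.
import * as readline from "node:readline";

type Rpc = { jsonrpc: "2.0"; id?: number | string; method: string; params?: any };

export function handle(req: Rpc): object | null {
  if (req.id === undefined) return null; // notifications get no reply
  switch (req.method) {
    case "initialize":
      return { jsonrpc: "2.0", id: req.id, result: {
        protocolVersion: "2024-11-05", // assumed protocol revision
        capabilities: { tools: {} },
        serverInfo: { name: "demo", version: "0.1.0" } } };
    case "tools/list":
      return { jsonrpc: "2.0", id: req.id, result: { tools: [{
        name: "echo", description: "Echo text back",
        inputSchema: { type: "object", properties: { text: { type: "string" } } } }] } };
    case "tools/call": // assumes "echo" is the only tool
      return { jsonrpc: "2.0", id: req.id, result: {
        content: [{ type: "text", text: req.params?.arguments?.text ?? "" }] } };
    default:
      return { jsonrpc: "2.0", id: req.id, error: { code: -32601, message: "Method not found" } };
  }
}

// The loop the agent talks to: one JSON-RPC message per line.
if (process.argv.includes("--serve")) {
  readline.createInterface({ input: process.stdin }).on("line", (line) => {
    const reply = handle(JSON.parse(line));
    if (reply) process.stdout.write(JSON.stringify(reply) + "\n");
  });
}
```

The agent owns the process lifecycle: it spawns this script, speaks over the pipes, and kills it when the session ends.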
The catch: it only works on the user’s machine. They have to clone your repo, run npm install, set environment variables, and configure their agent to point at the binary. Every install is a multi-step manual operation. Distribution is brutal.
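For Claude Desktop, that last step means hand-editing claude_desktop_config.json into something like this (the server name, path, and variable name here are illustrative):

```json
{
  "mcpServers": {
    "my-tools": {
      "command": "node",
      "args": ["./server.js"],
      "env": { "MY_TOOLS_API_KEY": "..." }
    }
  }
}
```

Multiply that by every user and every agent client, and the distribution problem is obvious.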
Hosted HTTP MCP (the production path)
Your MCP server runs as a regular HTTPS endpoint. Agents POST JSON-RPC, you respond. Cursor, Claude Desktop (via mcp-remote), Codex, and custom agents can all talk to it.
What you trade: you have to actually run the server (Vercel / Fly / Cloudflare Workers / a VM). What you get: one URL, no installs, every agent in the world can use it the moment they paste your endpoint.
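Structurally it is the same dispatcher, just behind an HTTP handler instead of a stdin loop. A minimal sketch with Node's built-in http module — handwritten, not the SDK's streamable-HTTP transport, and with TLS, auth, and the "echo" tool left as assumptions:

```typescript
// Sketch of a hosted MCP-style endpoint: the JSON-RPC message arrives as a
// POST body, the reply goes back as the response body.
import * as http from "node:http";

type Rpc = { jsonrpc: "2.0"; id?: number | string; method: string; params?: any };

export function dispatch(req: Rpc): object | null {
  if (req.id === undefined) return null; // notification: nothing to send back
  if (req.method === "tools/call") { // assumes "echo" is the only tool
    return { jsonrpc: "2.0", id: req.id, result: {
      content: [{ type: "text", text: req.params?.arguments?.text ?? "" }] } };
  }
  return { jsonrpc: "2.0", id: req.id, error: { code: -32601, message: "Method not found" } };
}

export const server = http.createServer((req, res) => {
  if (req.method !== "POST") { res.statusCode = 405; res.end(); return; }
  let body = "";
  req.on("data", (chunk) => (body += chunk));
  req.on("end", () => {
    const reply = dispatch(JSON.parse(body));
    res.statusCode = reply ? 200 : 202; // 202: notification accepted, no body
    res.setHeader("content-type", "application/json");
    res.end(reply ? JSON.stringify(reply) : "");
  });
});

if (process.argv.includes("--serve")) server.listen(3000);
```

Cursor can point at the URL directly; Claude Desktop reaches it through npx mcp-remote, which bridges the remote endpoint back to a local stdio process.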
Most production MCP servers should ship hosted by default. Stdio is for local development.
What ChiefLab does
ChiefLab is a hosted MCP server at chieflab.io/api/mcp — the closed-loop launch operator for agent-built products. Your agent calls one tool and gets back a launch pack, a signed approval URL, and per-channel briefs. No install required.