How MCP Works: One Protocol For Every AI Tool

You've asked an AI assistant to search the web, read a file, or query a database. Each of those required a separate integration: custom code that connects one AI application to one external service. If you wanted three AI apps to use five tools, someone had to build fifteen connectors. MCP, the Model Context Protocol, replaces that with a single standard interface.

The problem is combinatorial. Every AI application (Claude, ChatGPT, Cursor, your own agent) needs to talk to every external service (GitHub, Slack, Postgres, the filesystem). Without a shared protocol, each pairing requires its own connector with its own message format, authentication flow, and error handling. Add a new tool, and every AI client needs a new integration. Add a new AI client, and every tool needs a new one. The number of connectors grows as the product of clients and servers: the N×M problem.

MCP collapses that product into a sum. Each AI application implements one MCP client. Each external service implements one MCP server. The client and server speak the same wire protocol, JSON-RPC 2.0, so any client can talk to any server without custom glue code. Five clients and five servers means ten implementations instead of twenty-five. The savings compound as the ecosystem grows.
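The arithmetic is easy to check in a few lines (the counts here are just illustrative):

```python
def connectors_without_mcp(clients: int, servers: int) -> int:
    # Every client-server pairing needs its own custom connector: N * M.
    return clients * servers

def implementations_with_mcp(clients: int, servers: int) -> int:
    # Each client implements MCP once, each server implements MCP once: N + M.
    return clients + servers

print(connectors_without_mcp(5, 5))    # 25 custom connectors
print(implementations_with_mcp(5, 5))  # 10 implementations
```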

An MCP server exposes three primitives. Tools are functions the AI can call: run a database query, create a GitHub issue, send a message. Resources are data the AI can read: file contents, API responses, database records. Prompts are reusable templates that structure how the AI interacts with the server. A client discovers what a server offers by calling list methods (tools/list, resources/list), then invokes what it needs.
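A discovery exchange can be sketched as plain JSON-RPC 2.0 messages. The envelope and the tools/list method name follow the MCP specification; the tool metadata in the response is a made-up example:

```python
import json

# Client -> server: ask what tools this server offers.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# Server -> client: a typical response shape.
# "query_database" and its schema are hypothetical, for illustration only.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "query_database",
                "description": "Run a read-only SQL query",
                "inputSchema": {
                    "type": "object",
                    "properties": {"sql": {"type": "string"}},
                    "required": ["sql"],
                },
            }
        ]
    },
}

print(json.dumps(request))
```

The response's inputSchema is ordinary JSON Schema, which is how a client learns what arguments each tool expects before calling it.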

The conversation flow works like this: you ask the AI a question that requires external data. The AI, acting as an MCP client, recognizes it needs a tool. It sends a JSON-RPC request to the appropriate MCP server specifying the tool name and parameters. The server executes the operation, whether that's a SQL query, an API call, or a file read, and returns the result as structured content. The AI incorporates that result into its response. The entire exchange follows the same protocol regardless of which client or server is involved.
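The invocation step above can be sketched the same way. The tools/call method and the result's content-block shape follow the MCP specification; the tool name, SQL, and returned text are invented for the example:

```python
# Client -> server: invoke a tool by name, with arguments matching its schema.
call = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "query_database",  # hypothetical tool from discovery
        "arguments": {"sql": "SELECT count(*) FROM users"},
    },
}

# Server -> client: the outcome comes back as structured content blocks,
# which the AI then folds into its natural-language response.
result = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
        "content": [{"type": "text", "text": "42"}],  # illustrative value
        "isError": False,
    },
}

print(result["result"]["content"][0]["text"])
```

Note that the client never sees the SQL engine, the HTTP call, or the filesystem: it only sees the JSON-RPC result, which is what makes the exchange identical across servers.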

MCP is why the list of tools available to AI assistants is growing so quickly: building a new integration means implementing one server that works with every client, instead of a custom connector for each one.

#LLMs #infrastructure #mcp