In November 2024, Anthropic quietly released an open specification that is becoming the USB-C of LLM integrations: the Model Context Protocol (MCP).
What MCP solves
Before MCP, every tool integration was bespoke: each application had to wire up each model provider's function-calling format by hand. MCP standardizes LLM-to-tool communication using JSON-RPC 2.0 messages over transports such as stdio or HTTP. Write one MCP server and it works with any MCP-compatible client; Claude, ChatGPT, and Gemini tooling have all announced support.
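To make the JSON-RPC 2.0 framing concrete, here is a minimal sketch of what a tool-invocation request looks like on the wire. The `tools/call` method name follows the MCP spec; the tool name and arguments are made up for illustration.

```python
import json

# A hypothetical MCP "tools/call" request in JSON-RPC 2.0 framing.
# The tool name and arguments below are illustrative, not from any real server.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",            # hypothetical tool
        "arguments": {"city": "Paris"},   # hypothetical arguments
    },
}

wire = json.dumps(request)    # this string is what travels over stdio or HTTP
decoded = json.loads(wire)
print(decoded["method"])      # → tools/call
```

Because every client and server speaks this same envelope, neither side needs to know anything about the other beyond the protocol itself.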
Three primitives
- Tools: functions the LLM can call.
- Resources: readable data the LLM can consult.
- Prompts: reusable prompt templates.
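A toy sketch can show how the three primitives map onto a server's dispatch logic. This is not the official SDK — just an in-memory registry under illustrative names; the method strings (`tools/call`, `resources/read`, `prompts/get`) follow the spec's naming, but everything else here is assumed for the example.

```python
# Toy registries for the three MCP primitives (illustrative, not the real SDK).
tools = {"add": lambda args: args["a"] + args["b"]}               # Tools: callable functions
resources = {"docs://readme": "Hello from a resource."}           # Resources: readable data
prompts = {"summarize": "Summarize the following text:\n{text}"}  # Prompts: templates

def handle(method, params):
    """Dispatch a simplified request to the matching primitive."""
    if method == "tools/call":
        return tools[params["name"]](params["arguments"])
    if method == "resources/read":
        return resources[params["uri"]]
    if method == "prompts/get":
        return prompts[params["name"]].format(**params["arguments"])
    raise ValueError(f"unknown method: {method}")

print(handle("tools/call", {"name": "add", "arguments": {"a": 2, "b": 3}}))  # → 5
print(handle("resources/read", {"uri": "docs://readme"}))
```

The split matters in practice: tools are invoked by the model, resources are attached by the host application, and prompts are selected by the user — three different control surfaces behind one protocol.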
Why it changes everything
MCP is to AI what REST was to the web: an abstraction that decouples consumers from providers. The official spec lives at modelcontextprotocol.io.