In March 2023, OpenAI launched ChatGPT plugins to massive hype. AI could finally use tools: browsing the web, running code, booking restaurants, searching flights. The plugin store grew to over 1,000 entries. Developers rushed to build integrations.
Twelve months later, the plugin store was dead. OpenAI shut it down on March 19, 2024, and killed remaining plugin conversations by April 9. The replacement, Custom GPTs with Actions, took a different direction entirely. And then in November 2024, Anthropic released Model Context Protocol, which rewrote the rules again.
This is the story of ChatGPT plugins vs MCP: what happened, what replaced what, and why MCP is fundamentally different from everything that came before.
The rise and fall of ChatGPT plugins
ChatGPT plugins launched as a beta in March 2023. The pitch was compelling: third-party developers could build plugins that gave ChatGPT abilities beyond text generation. The initial partners included Expedia, Instacart, Kayak, OpenTable, Shopify, and Wolfram Alpha.
The plugin model worked like this:
- A developer hosted an API and wrote an OpenAPI manifest describing it
- OpenAI reviewed and approved the plugin
- Users could browse the plugin store and enable up to three plugins per conversation
- ChatGPT would call the plugin's API when relevant
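The manifest at the center of that flow was an `ai-plugin.json` file pointing at the developer's OpenAPI spec. The sketch below is illustrative, reconstructed from the general shape of plugin manifests rather than copied from any real plugin; the `description_for_model` field effectively served as a prompt telling ChatGPT when to call the API.

```json
{
  "schema_version": "v1",
  "name_for_human": "Todo Plugin",
  "name_for_model": "todo",
  "description_for_human": "Manage a simple to-do list.",
  "description_for_model": "Plugin for adding, listing, and deleting items in the user's to-do list.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "https://example.com/openapi.yaml"
  },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "support@example.com",
  "legal_info_url": "https://example.com/legal"
}
```

Notice how much rides on prose descriptions: the model had only these fields and the OpenAPI spec to decide when and how to call the API, which is part of why plugin invocation was unreliable.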
The problem was adoption. Despite the hype, most users never enabled plugins. Individual plugins were hard to discover in the store, unreliable in practice, and the three-plugin limit created friction. Many plugins solved problems users did not actually have, or solved them poorly because the AI could not always figure out when and how to call them correctly.
By late 2023, OpenAI had already pivoted. At DevDay in November 2023, they launched Custom GPTs with Actions, a fundamentally different approach that bundled tools into purpose-built chat configurations. The GPT Store launched in January 2024.
The plugin store closed on March 19, 2024. No new installations, no new conversations. Existing conversations with plugins continued briefly until April 9, 2024, when those were shut down too. Plugins were fully dead within 13 months of launch.
What replaced them: GPT Actions and Custom GPTs
Custom GPTs with Actions replaced plugins as OpenAI's tool integration model. The shift was significant.
GPTs are preconfigured chat experiences. A creator defines the GPT's instructions, knowledge files, and available actions. Users interact with a finished product rather than assembling plugins themselves. More than 3 million GPTs have been created, roughly 159,000 of them publicly available in the GPT Store.
GPT Actions use OpenAPI specifications (the same format as plugins) but are embedded inside a specific GPT rather than available globally. This means each GPT is self-contained: it knows what tools it has and how to use them. No more hoping the AI figures out which of three enabled plugins to call.
This model works better for end users. Instead of enabling "the flight search plugin" and "the hotel plugin" and "the calendar plugin," you use a "Trip Planner" GPT that has all three actions built in. The creator has already configured the prompts, the tool selection logic, and the data flow.
But GPT Actions have a fundamental limitation: they only work inside OpenAI's ecosystem. A GPT Action built for ChatGPT does not work in Claude, Gemini, Cursor, VS Code, or any other AI tool. If you build an integration using GPT Actions, you are building it for one platform.
Enter MCP: a different philosophy
On November 25, 2024, Anthropic released Model Context Protocol as an open standard. The philosophy was the opposite of both plugins and GPT Actions.
Instead of a platform-specific integration system controlled by one company, MCP is an open protocol that any AI client can implement. Build one MCP server, and it works with Claude, ChatGPT, Gemini, Cursor, VS Code, Windsurf, Zed, and any other client that supports the protocol.
For a complete introduction to how MCP works, see What is MCP? A Plain-English Guide for AI Users.
The technical model is straightforward. An MCP server exposes tools (functions with defined inputs and outputs), resources (data the AI can read), and prompts (reusable templates). The AI client connects to the server, discovers what is available, and uses the tools when relevant. Transport happens over stdio (local processes) or Streamable HTTP (remote servers, introduced in the March 2025 protocol revision).
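The discover-then-invoke handshake can be sketched with a stdlib-only mock of the JSON-RPC messages an MCP server answers. The method names (`tools/list`, `tools/call`) and the content-block result shape come from the MCP spec; the `add` tool and the rest of this code are illustrative, not the official SDK.

```python
# Tool registry: name -> description, input schema, and implementation.
# The "add" tool is a stand-in for whatever your server exposes.
TOOLS = {
    "add": {
        "description": "Add two integers.",
        "inputSchema": {
            "type": "object",
            "properties": {"a": {"type": "integer"}, "b": {"type": "integer"}},
            "required": ["a", "b"],
        },
        "fn": lambda args: args["a"] + args["b"],
    }
}

def handle(request: dict) -> dict:
    """Dispatch one JSON-RPC 2.0 request the way an MCP server would."""
    method = request["method"]
    if method == "tools/list":
        # Discovery: the client asks what tools exist and gets their schemas.
        result = {"tools": [
            {"name": name,
             "description": tool["description"],
             "inputSchema": tool["inputSchema"]}
            for name, tool in TOOLS.items()
        ]}
    elif method == "tools/call":
        # Invocation: the client names a tool and supplies arguments;
        # results come back as a list of content blocks.
        params = request["params"]
        value = TOOLS[params["name"]]["fn"](params["arguments"])
        result = {"content": [{"type": "text", "text": str(value)}]}
    else:
        return {"jsonrpc": "2.0", "id": request.get("id"),
                "error": {"code": -32601, "message": f"Unknown method: {method}"}}
    return {"jsonrpc": "2.0", "id": request.get("id"), "result": result}
```

A real server would read newline-delimited JSON-RPC from stdin (the stdio transport) or accept it over Streamable HTTP; the official SDKs handle that framing, plus resources and prompts, for you.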
The critical difference from plugins and GPT Actions is governance. MCP is not controlled by any single company. In December 2025, Anthropic donated MCP to the Agentic AI Foundation under the Linux Foundation. The foundation is co-led by Anthropic, Block, and OpenAI, with support from Google, Microsoft, AWS, Cloudflare, and Bloomberg.
Yes, OpenAI. The company that shut down its own plugin store joined the foundation governing MCP. That tells you everything about where the industry is heading.
ChatGPT plugins vs GPT Actions vs MCP compared
| Feature | ChatGPT Plugins (dead) | GPT Actions (active) | MCP (active) |
|---|---|---|---|
| Status | Shut down April 2024 | Active | Active, growing |
| Governance | OpenAI controlled | OpenAI controlled | Open standard (Linux Foundation) |
| Platform support | ChatGPT only | ChatGPT only | Claude, ChatGPT, Gemini, Cursor, VS Code, Windsurf, Zed, and more |
| Discovery | Plugin store (centralized) | GPT Store (centralized) | Multiple directories (PulseMCP, mcp.so, Anthropic connectors) |
| Setup for users | Enable in plugin store | Use a GPT | Add server URL in settings |
| Setup for developers | OpenAPI manifest + approval | OpenAPI manifest in GPT builder | MCP SDK (10 languages) |
| Approval required | Yes (OpenAI review) | No (anyone can create GPTs) | No (anyone can host a server) |
| Tools per session | Max 3 plugins | Per-GPT (no hard limit) | Unlimited servers |
| Protocol spec | Proprietary | OpenAPI-based, proprietary wrapper | Open spec (modelcontextprotocol.io) |
| Ecosystem size | ~1,000 at peak | 3M+ GPTs, ~159K public | 8,600+ servers (PulseMCP), 17,900+ (mcp.so) |
| SDK languages | N/A (API-only) | N/A (API-only) | 10 official SDKs |
Why OpenAI adopted MCP
In March 2025, OpenAI announced MCP support across the Agents SDK, Responses API, and ChatGPT desktop app. By October 2025, ChatGPT Developer Mode launched with full MCP support. In December 2025, the App Directory and Apps SDK brought MCP to the full ChatGPT ecosystem.
OpenAI had every reason to build their own standard. They had the largest user base, an existing (if struggling) integration ecosystem, and the resources to go it alone. They adopted MCP anyway.
The reason is network effects. A protocol is only valuable if tools support it. By early 2025, thousands of MCP servers already existed for everything from databases to design tools to CRM systems. Building a competing standard would mean convincing all those developers to build a second integration. Adopting MCP meant instant access to the entire existing ecosystem.
This is the same dynamic that drove the web to standardize on HTTP, email to standardize on SMTP, and package management to coalesce around npm and pip. Open protocols win because the cost of fragmentation exceeds the benefit of control.
What ChatGPT plugins vs MCP means for developers and users
If you are a developer building AI integrations, MCP is the clear choice. Build one server, and it works everywhere. The official SDKs cover 10 languages: TypeScript, Python, C#, Java, Kotlin, Go, Rust, Swift, Ruby, and PHP. A minimal server takes 30 minutes to build.
If you are a non-developer using AI tools, MCP means you benefit from the same integration regardless of which AI assistant you prefer. An MCP server that publishes documents works in Claude, ChatGPT, and Cursor. You are not locked into one platform's ecosystem.
The practical impact is already visible. Anthropic's connectors directory lists over 50 integrations. Claude can publish documents, search the web, manage your calendar, and interact with dozens of services through MCP. There are now MCP servers for content publishing, web search, file management, and every major productivity platform. The same capabilities are arriving in ChatGPT, Gemini, and every major AI client.
For non-developers, MCP has an even simpler value proposition. You do not need to understand the protocol to benefit from it. AI assistants use MCP tools on your behalf, turning conversations into published documents, managing your calendar, and connecting to your workspace. The tools that used to require plugins now work across every major AI platform.
GPT Actions still have a role for highly customized, single-purpose GPTs. If you need a specific ChatGPT experience with tailored instructions and curated tools, GPT Actions are the right tool. But for general-purpose integrations that work across platforms, MCP has won.
The plugins era taught us that closed, platform-locked integrations do not scale. The MCP era is teaching us that open protocols, backed by industry consensus, do.
