by VeriTeknik
Aggregates multiple MCP servers into a single unified interface, offering an AI playground for testing, RAG document search, real‑time notifications, and support for both STDIO and Streamable HTTP transports.
The proxy acts as a middleware layer that unifies access to any number of Model Context Protocol (MCP) servers. It fetches tool, prompt, and resource definitions from the plugged.in App and routes client calls to the appropriate downstream servers, turning a fragmented AI ecosystem into a single, searchable, and manageable endpoint.
npx -y @pluggedin/pluggedin-mcp-proxy@latest --pluggedin-api-key YOUR_API_KEY
Default mode runs as a STDIO process, ideal for desktop MCP clients. Use `--transport streamable-http` and `--port <number>` to expose a REST‑like endpoint. Built‑in tools such as `pluggedin_discover_tools`, `pluggedin_rag_query`, and `pluggedin_send_notification` can be called directly from the client.
Q: Do I need to install anything besides Node.js?
A: No. The proxy is delivered as an npm package; running it with `npx` fetches it on the fly.
Q: Can I run the proxy behind a firewall?
A: Yes. Use Streamable HTTP mode with `--require-api-auth` and supply a bearer token with each request.
Q: How does discovery caching work?
A: `pluggedin_discover_tools` returns cached tool lists first (under 1 s). A background refresh updates the cache when `force_refresh: true` is supplied or when no cache exists yet.
Q: What transport should I choose? A: STDIO is simplest for desktop clients. HTTP mode is required for remote integrations, container deployments, or when you need explicit port exposure.
Q: Is there a way to limit which tools are exposed? A: Tools are filtered per workspace configuration in the plugged.in App; only the enabled server‑side tools are advertised to the client.
Q: How are documents versioned? A: Each AI‑generated document stores a SHA‑256 hash of its content, model metadata, and a parent reference for previous versions, enabling full history tracking.
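As a rough sketch of what that record contains (the field names below are illustrative assumptions, not the actual schema), the stored hash can be reproduced with standard tooling:

# Compute the SHA-256 content hash a document version stores
HASH=$(sha256sum analysis-report.md | cut -d' ' -f1)
# Hypothetical record shape: content hash, model metadata, parent reference
cat <<EOF
{
  "contentHash": "$HASH",
  "model": { "name": "Claude 3 Opus", "provider": "Anthropic" },
  "parentDocumentId": "uuid-of-previous-version"
}
EOF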
Q: Are there any licensing concerns? A: The project is released under the MIT License, allowing free commercial and private use.
The plugged.in MCP Proxy Server is a powerful middleware that aggregates multiple Model Context Protocol (MCP) servers into a single unified interface. It fetches tool, prompt, and resource configurations from the plugged.in App and intelligently routes requests to the appropriate underlying MCP servers.
This proxy enables seamless integration with any MCP client (Claude, Cline, Cursor, etc.) while providing advanced management capabilities through the plugged.in ecosystem.
⭐ If you find this project useful, please consider giving it a star on GitHub! It helps us reach more developers and motivates us to keep improving.
Documents are tracked by source (`ai_generated`, `upload`, or `api`), so searches can filter on provenance.
The proxy provides two distinct categories of tools:
These tools are built into the proxy and work without any server configuration:
- `pluggedin_discover_tools` - Smart discovery with caching for instant results
- `pluggedin_rag_query` - RAG v2 search across your documents with AI filtering capabilities
- `pluggedin_send_notification` - Send notifications with optional email delivery
- `pluggedin_create_document` - (Coming Soon) Create AI-generated documents in your library

These tools come from your configured MCP servers and can be enabled or disabled per workspace.
The discovery tool intelligently shows both categories, giving AI models immediate access to all available capabilities.
# Quick discovery - returns cached data instantly
pluggedin_discover_tools()
# Force refresh - shows current tools + runs background discovery
pluggedin_discover_tools({"force_refresh": true})
# Discover specific server
pluggedin_discover_tools({"server_uuid": "uuid-here"})
Example Response:
## 🔧 Static Built-in Tools (Always Available):
1. **pluggedin_discover_tools** - Smart discovery with caching
2. **pluggedin_rag_query** - RAG v2 search across documents with AI filtering
3. **pluggedin_send_notification** - Send notifications
4. **pluggedin_create_document** - (Coming Soon) Create AI-generated documents
## ⚡ Dynamic MCP Tools (8) - From Connected Servers:
1. **query** - Run read-only SQL queries
2. **generate_random_integer** - Generate secure random integers
...
The enhanced RAG v2 system allows MCP servers to create and search documents with full AI attribution:
# Search for documents created by specific AI models
pluggedin_rag_query({
"query": "system architecture",
"filters": {
"modelName": "Claude 3 Opus",
"source": "ai_generated",
"tags": ["technical"]
}
})
# Search across all document sources
pluggedin_rag_query({
"query": "deployment guide",
"filters": {
"dateFrom": "2024-01-01",
"visibility": "workspace"
}
})
# Future: Create AI-generated documents (Coming Soon)
pluggedin_create_document({
"title": "Analysis Report",
"content": "# Market Analysis\n\nDetailed findings...",
"format": "md",
"tags": ["analysis", "market"],
"metadata": {
"model": {
"name": "Claude 3 Opus",
"provider": "Anthropic"
}
}
})
# Install and run with npx (latest v1.0.0)
npx -y @pluggedin/pluggedin-mcp-proxy@latest --pluggedin-api-key YOUR_API_KEY
For existing installations, see our Migration Guide for detailed upgrade instructions.
# Quick upgrade
npx -y @pluggedin/pluggedin-mcp-proxy@1.0.0 --pluggedin-api-key YOUR_API_KEY
Add the following to your Claude Desktop configuration:
{
"mcpServers": {
"pluggedin": {
"command": "npx",
"args": ["-y", "@pluggedin/pluggedin-mcp-proxy@latest"],
"env": {
"PLUGGEDIN_API_KEY": "YOUR_API_KEY"
}
}
}
}
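On macOS, Claude Desktop typically reads this file from ~/Library/Application Support/Claude/claude_desktop_config.json; on Windows, from %APPDATA%\Claude\claude_desktop_config.json.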
Add the following to your Cline configuration:
{
"mcpServers": {
"pluggedin": {
"command": "npx",
"args": ["-y", "@pluggedin/pluggedin-mcp-proxy@latest"],
"env": {
"PLUGGEDIN_API_KEY": "YOUR_API_KEY"
}
}
}
}
For Cursor, you can use command-line arguments instead of environment variables:
npx -y @pluggedin/pluggedin-mcp-proxy@latest --pluggedin-api-key YOUR_API_KEY
| Variable | Description | Required | Default |
|---|---|---|---|
| `PLUGGEDIN_API_KEY` | API key from plugged.in App | Yes | - |
| `PLUGGEDIN_API_BASE_URL` | Base URL for plugged.in App | No | `https://plugged.in` |
Command line arguments take precedence over environment variables:
npx -y @pluggedin/pluggedin-mcp-proxy@latest --pluggedin-api-key YOUR_API_KEY --pluggedin-api-base-url https://your-custom-url.com
| Option | Description | Default |
|---|---|---|
| `--transport <type>` | Transport type: `stdio` or `streamable-http` | `stdio` |
| `--port <number>` | Port for Streamable HTTP server | `12006` |
| `--stateless` | Enable stateless mode for Streamable HTTP | `false` |
| `--require-api-auth` | Require API key for Streamable HTTP requests | `false` |
For a complete list of options:
npx -y @pluggedin/pluggedin-mcp-proxy@latest --help
The proxy can run as an HTTP server instead of STDIO, enabling web-based access and remote connections.
# Run as HTTP server on default port (12006)
npx -y @pluggedin/pluggedin-mcp-proxy@latest --transport streamable-http --pluggedin-api-key YOUR_API_KEY
# Custom port
npx -y @pluggedin/pluggedin-mcp-proxy@latest --transport streamable-http --port 8080 --pluggedin-api-key YOUR_API_KEY
# With authentication required
npx -y @pluggedin/pluggedin-mcp-proxy@latest --transport streamable-http --require-api-auth --pluggedin-api-key YOUR_API_KEY
# Stateless mode (new session per request)
npx -y @pluggedin/pluggedin-mcp-proxy@latest --transport streamable-http --stateless --pluggedin-api-key YOUR_API_KEY
- `POST /mcp` - Send MCP messages
- `GET /mcp` - Server-sent events stream (optional)
- `DELETE /mcp` - Terminate session
- `GET /health` - Health check endpoint

In stateful mode (default), use the `mcp-session-id` header to maintain sessions:
# First request creates a session
curl -X POST http://localhost:12006/mcp \
-H "Content-Type: application/json" \
-H "Accept: application/json, text/event-stream" \
-d '{"jsonrpc":"2.0","method":"tools/list","id":1}'
# Subsequent requests use the same session
curl -X POST http://localhost:12006/mcp \
-H "Content-Type: application/json" \
-H "Accept: application/json, text/event-stream" \
-H "mcp-session-id: YOUR_SESSION_ID" \
-d '{"jsonrpc":"2.0","method":"tools/call","params":{"name":"tool_name"},"id":2}'
When using --require-api-auth
, include your API key as a Bearer token:
curl -X POST http://localhost:12006/mcp \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-H "Accept: application/json, text/event-stream" \
-d '{"jsonrpc":"2.0","method":"ping","id":1}'
You can also build and run the proxy server using Docker.
Ensure you have Docker installed and running. Navigate to the pluggedin-mcp
directory and run:
docker build -t pluggedin-mcp-proxy:latest .
A .dockerignore
file is included to optimize the build context.
Run the container in STDIO mode for MCP Inspector testing:
docker run -it --rm \
-e PLUGGEDIN_API_KEY="YOUR_API_KEY" \
-e PLUGGEDIN_API_BASE_URL="YOUR_API_BASE_URL" \
--name pluggedin-mcp-container \
pluggedin-mcp-proxy:latest
Run the container as an HTTP server:
docker run -d --rm \
-e PLUGGEDIN_API_KEY="YOUR_API_KEY" \
-e PLUGGEDIN_API_BASE_URL="YOUR_API_BASE_URL" \
-p 12006:12006 \
--name pluggedin-mcp-http \
pluggedin-mcp-proxy:latest \
--transport streamable-http --port 12006
Replace `YOUR_API_KEY` and `YOUR_API_BASE_URL` (if not using the default `https://plugged.in`).
While the container is running, you can connect to it using the MCP Inspector:
npx @modelcontextprotocol/inspector docker://pluggedin-mcp-container
This will connect to the standard input/output of the running container.
Press Ctrl+C
in the terminal where docker run
is executing. The --rm
flag ensures the container is removed automatically upon stopping.
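The detached HTTP container started earlier has no attached terminal; stop it with docker stop (its --rm flag likewise removes it on exit):

docker stop pluggedin-mcp-http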
The plugged.in MCP Proxy Server acts as a bridge between MCP clients and multiple underlying MCP servers:
- Tool discovery (`pluggedin_discover_tools`): returns cached results first; with `force_refresh=true`, runs discovery in the background while showing current tools
- `tools/list`: Fetches from `/api/tools` (includes static + dynamic tools)
- `resources/list`: Fetches from `/api/resources`
- `resource-templates/list`: Fetches from `/api/resource-templates`
- `prompts/list`: Fetches from `/api/prompts` and `/api/custom-instructions`, merges results
- `tools/call`: Parses the prefix from the tool name and looks up the target server in an internal map
- `resources/read`: Calls `/api/resolve/resource?uri=...` to get server details
- `prompts/get`: Checks for a custom-instruction prefix or calls `/api/resolve/prompt?name=...`
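As an illustrative sketch only (the auth header and URI value here are assumptions, not documented API behavior), the resolution step for a resource read looks roughly like:

# Hypothetical: resolve which downstream server owns a resource URI
curl -H "Authorization: Bearer YOUR_API_KEY" \
  "https://plugged.in/api/resolve/resource?uri=file:///data/report.md"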
The plugged.in MCP Proxy implements comprehensive security measures to protect your system and data:
- `.env` files are parsed with proper handling of quotes and multiline values
- Child processes are spawned with `execFile()` instead of `exec()` to prevent shell injection
- Only an allow-list of commands may be executed:
  - `node`, `npx` - Node.js commands
  - `python`, `python3` - Python commands
  - `uv`, `uvx`, `uvenv` - UV Python tools

A dedicated `security-utils.ts` module implements these protections.
For detailed security implementation, see SECURITY.md.
The plugged.in MCP Proxy Server is designed to work seamlessly with the plugged.in App, which provides:
Contributions are welcome! Please feel free to submit a Pull Request.
A `/health` endpoint is provided for service monitoring. See Release Notes for complete details.
Tests are included for development purposes but are excluded from Docker builds to minimize the container footprint.
# Run tests locally
npm test
# or
./scripts/test-local.sh
# Run tests in watch mode
npm run test:watch
# Run tests with UI
npm run test:ui
The Docker image is optimized for minimal footprint:
# Build optimized Docker image
docker build -t pluggedin-mcp .
# Check image size
docker images pluggedin-mcp
This project is licensed under the MIT License - see the LICENSE file for details.
{ "mcpServers": { "pluggedin": { "command": "npx", "args": [ "-y", "@pluggedin/pluggedin-mcp-proxy@latest" ], "env": { "PLUGGEDIN_API_KEY": "<YOUR_API_KEY>" } } } }
Discover more MCP servers with similar functionality and use cases
by danny-avila
Provides a customizable ChatGPT‑like web UI that integrates dozens of AI models, agents, code execution, image generation, web search, speech capabilities, and secure multi‑user authentication, all open‑source and ready for self‑hosting.
by ahujasid
BlenderMCP integrates Blender with Claude AI via the Model Context Protocol (MCP), enabling AI-driven 3D scene creation, modeling, and manipulation. This project allows users to control Blender directly through natural language prompts, streamlining the 3D design workflow.
by pydantic
Enables building production‑grade generative AI applications using Pydantic validation, offering a FastAPI‑like developer experience.
by GLips
Figma-Context-MCP is a Model Context Protocol (MCP) server that provides Figma layout information to AI coding agents. It bridges design and development by enabling AI tools to directly access and interpret Figma design data for more accurate and efficient code generation.
by mcp-use
Easily create and interact with MCP servers using custom agents, supporting any LLM with tool calling and offering multi‑server, sandboxed, and streaming capabilities.
by sonnylazuardi
This project implements a Model Context Protocol (MCP) integration between Cursor AI and Figma, allowing Cursor to communicate with Figma for reading designs and modifying them programmatically.
by lharries
WhatsApp MCP Server is a Model Context Protocol (MCP) server for WhatsApp that allows users to search, read, and send WhatsApp messages (including media) through AI models like Claude. It connects directly to your personal WhatsApp account via the WhatsApp web multi-device API and stores messages locally in a SQLite database.
by idosal
GitMCP is a free, open-source remote Model Context Protocol (MCP) server that transforms any GitHub project into a documentation hub, enabling AI tools to access up-to-date documentation and code directly from the source to eliminate "code hallucinations."
by Klavis-AI
Klavis AI provides open-source Multi-platform Control Protocol (MCP) integrations and a hosted API for AI applications. It simplifies connecting AI to various third-party services by managing secure MCP servers and authentication.