by membranehq
Provides actions for connected integrations on Integration.app as tools that can be invoked by AI agents or other MCP‑compatible clients.
The MCP Server exposes the tools (actions) of every integration linked to an Integration.app account through Model Context Protocol transports. Clients can retrieve the full tool catalog or dynamically enable a subset, then invoke those tools during a conversation.
git clone https://github.com/integration-app/mcp-server.git
cd mcp-server
npm install
npm run build
npm run dev
# server listens on http://localhost:3000
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';
const client = new Client({ name: 'my-client', version: '1.0.0' });
const transport = new StreamableHTTPClientTransport(
  new URL('https://<HOSTED_MCP_SERVER_URL>/mcp'),
  { requestInit: { headers: { Authorization: `Bearer ${ACCESS_TOKEN}` } } }
);
await client.connect(transport);
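Once connected, the client can list the tool catalog and invoke a tool. A minimal sketch continuing the snippet above (the `gmail-send-email` tool name comes from the examples later in this page; its arguments are illustrative and depend on your connected integrations):

```typescript
// List every tool exposed for the account's connected integrations.
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));

// Invoke one of the returned tools (arguments shown are illustrative).
const result = await client.callTool({
  name: 'gmail-send-email',
  arguments: { to: 'user@example.com', subject: 'Hello', body: 'Sent via MCP' },
});
console.log(result.content);
```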
- Static mode (`/mcp`) returns all tools; dynamic mode (`/mcp?mode=dynamic`) returns only `enable-tools` for selective activation.
- Send an `x-chat-id` header to keep a persistent session (experimental).
- Two transports: deprecated SSE (`/sse`) and recommended Streamable HTTP (`/mcp`).
- Tools can be limited to specific integrations with the `apps` query parameter.
- Chat sessions are tracked via the `x-chat-id` header.

Q: Which transport should I use?
Streamable HTTP (`/mcp`). SSE is deprecated as of Nov 2024.

Q: How do I limit the tools returned?
Connect with `?mode=dynamic` to get only the `enable-tools` tool, then call it with the list of desired tools. You can also filter by integration with `?apps=google-calendar,google-docs`.

Q: Do I need to set up a database?

Q: Can I run the server in production?
Yes. Build the Docker image (`docker build -t integration-app-mcp-server .`) and run it (`docker run -p 3000:3000 integration-app-mcp-server`).

Q: How are chat sessions handled?
Include an `x-chat-id` header on each request. The server will map it to an internal session UUID, allowing stateful conversations across calls.

The Integration App MCP Server is a Model Context Protocol (MCP) server that provides actions for connected integrations on the Integration.app membrane as tools.
Here's our official AI Agent Example that shows you how to use this MCP server in your application.
git clone https://github.com/integration-app/mcp-server.git
cd mcp-server
npm install
npm run build
To run the development server locally, start it with:
npm run dev
The server will be live at http://localhost:3000 ⚡️
# Run the server in test mode
npm run start:test
# then run tests
npm test
Deploy your own instance of this MCP server to any cloud hosting service of your choice.
The project includes a Dockerfile for easy containerized deployment.
docker build -t integration-app-mcp-server .
docker run -p 3000:3000 integration-app-mcp-server
This MCP server supports two transports:
| Transport | Endpoint | Status |
|---|---|---|
| SSE (Server-Sent Events) | `/sse` | 🔴 Deprecated as of November 5, 2024 in the MCP spec |
| HTTP (Streamable HTTP) | `/mcp` | 🟢 Recommended; replaces SSE and supports bidirectional streaming |
Provide an Integration.app access token via a query parameter or the Authorization header:
?token=ACCESS_TOKEN
Authorization: Bearer ACCESS_TOKEN
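For example, either option can be supplied when constructing a Streamable HTTP transport; a sketch assuming the same client setup as above (`<HOSTED_MCP_SERVER_URL>` and `ACCESS_TOKEN` are placeholders):

```typescript
// Option 1: token as a query parameter.
const transportViaQuery = new StreamableHTTPClientTransport(
  new URL(`https://<HOSTED_MCP_SERVER_URL>/mcp?token=${ACCESS_TOKEN}`)
);

// Option 2: token in the Authorization header (used in the examples below).
const transportViaHeader = new StreamableHTTPClientTransport(
  new URL('https://<HOSTED_MCP_SERVER_URL>/mcp'),
  { requestInit: { headers: { Authorization: `Bearer ${ACCESS_TOKEN}` } } }
);
```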
SSE (Deprecated)
import { SSEClientTransport } from '@modelcontextprotocol/sdk/client/sse.js';

await client.connect(
  new SSEClientTransport(
    new URL(`https://<HOSTED_MCP_SERVER_URL>/sse`),
    {
      requestInit: {
        headers: {
          Authorization: `Bearer ${ACCESS_TOKEN}`,
        },
      },
    }
  )
);
Streamable HTTP (Recommended)
await client.connect(
  new StreamableHTTPClientTransport(
    new URL(`https://<HOSTED_MCP_SERVER_URL>/mcp`),
    {
      requestInit: {
        headers: {
          Authorization: `Bearer ${ACCESS_TOKEN}`,
        },
      },
    }
  )
);
By default, the MCP server runs in static mode, which means it returns all available tools (actions) for all connected integrations.

With dynamic mode (`?mode=dynamic`), the server will only return one tool: `enable-tools`. You can use this tool to selectively enable the tools you actually need for that session.

In dynamic mode, your implementation should figure out which tools are most relevant to the user's query. Once you've identified them, prompt the LLM to call the `enable-tools` tool with the appropriate list.
Want to see how this works in practice? Check out our AI Agent Example.
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';
const client = new Client({
  name: 'example-integration-app-mcp-client',
  version: '1.0.0',
});

const transport = new StreamableHTTPClientTransport(
  new URL(`https://<HOSTED_MCP_SERVER_URL>/mcp?mode=dynamic`),
  {
    requestInit: {
      headers: {
        Authorization: `Bearer ${ACCESS_TOKEN}`,
      },
    },
  }
);

await client.connect(transport);

await client.callTool({
  name: 'enable-tools',
  arguments: {
    tools: ['gmail-send-email', 'gmail-read-email'],
  },
});
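After `enable-tools` succeeds, subsequent tool listings for that session should contain only the enabled tools. A quick sanity check (a sketch, not part of the official example):

```typescript
// The tool catalog for this session is now limited to the enabled tools.
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name)); // expected: ['gmail-send-email', 'gmail-read-email']
```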
In static mode, the MCP server fetches tools from all active connections associated with the provided token. You can choose to only fetch tools for specific integrations by passing the `apps` query parameter: `/mcp?apps=google-calendar,google-docs`
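For example, a client that only needs Google Calendar and Google Docs tools could connect with the filter applied; a sketch using the same transport setup as above:

```typescript
// Only tools from the listed integrations will be returned.
const transport = new StreamableHTTPClientTransport(
  new URL('https://<HOSTED_MCP_SERVER_URL>/mcp?apps=google-calendar,google-docs'),
  { requestInit: { headers: { Authorization: `Bearer ${ACCESS_TOKEN}` } } }
);
await client.connect(transport);
```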
The MCP server (Streamable HTTP transport only) supports persistent chat sessions. Include an `x-chat-id` header in your requests to automatically track sessions for that specific chat. This is an experimental feature that we provide in addition to standard MCP sessions.
Starting a new chat session:
POST /mcp
Authorization: Bearer YOUR_ACCESS_TOKEN
x-chat-id: my-awesome-chat-123
Retrieving your chat sessions:
GET /mcp/sessions
Authorization: Bearer YOUR_ACCESS_TOKEN
Response:
{
"my-awesome-chat-123": "session-uuid-1",
"another-chat-456": "session-uuid-2"
}
This feature lets you reuse the same session across a conversation. Check out our AI Agent Example to see how this works in practice.
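The same headers can be set from an MCP client; a minimal sketch assuming the Streamable HTTP transport shown earlier (the chat id value is arbitrary, and the `/mcp/sessions` endpoint is queried with plain `fetch`):

```typescript
// Attach a chat id so the server maps this conversation to a persistent session.
const transport = new StreamableHTTPClientTransport(
  new URL('https://<HOSTED_MCP_SERVER_URL>/mcp'),
  {
    requestInit: {
      headers: {
        Authorization: `Bearer ${ACCESS_TOKEN}`,
        'x-chat-id': 'my-awesome-chat-123',
      },
    },
  }
);
await client.connect(transport);

// List the chat-id -> session mappings for this token.
const response = await fetch('https://<HOSTED_MCP_SERVER_URL>/mcp/sessions', {
  headers: { Authorization: `Bearer ${ACCESS_TOKEN}` },
});
console.log(await response.json()); // e.g. { "my-awesome-chat-123": "session-uuid-1" }
```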
To use this server with Cursor, update the `~/.cursor/mcp.json` file:
{
"mcpServers": {
"integration-app": {
"url": "https://<HOSTED_MCP_SERVER_URL>/sse?token={ACCESS_TOKEN}"
}
}
}
Restart Cursor for the changes to take effect.
To use this server with Claude, update the config file (Settings > Developer > Edit Config):
{
"mcpServers": {
"integration-app": {
"url": "https://<HOSTED_MCP_SERVER_URL>/sse?token={ACCESS_TOKEN}"
}
}
}
Discover more MCP servers with similar functionality and use cases
by danny-avila
Provides a customizable ChatGPT‑like web UI that integrates dozens of AI models, agents, code execution, image generation, web search, speech capabilities, and secure multi‑user authentication, all open‑source and ready for self‑hosting.
by ahujasid
BlenderMCP integrates Blender with Claude AI via the Model Context Protocol (MCP), enabling AI-driven 3D scene creation, modeling, and manipulation. This project allows users to control Blender directly through natural language prompts, streamlining the 3D design workflow.
by pydantic
Enables building production‑grade generative AI applications using Pydantic validation, offering a FastAPI‑like developer experience.
by GLips
Figma-Context-MCP is a Model Context Protocol (MCP) server that provides Figma layout information to AI coding agents. It bridges design and development by enabling AI tools to directly access and interpret Figma design data for more accurate and efficient code generation.
by mcp-use
Easily create and interact with MCP servers using custom agents, supporting any LLM with tool calling and offering multi‑server, sandboxed, and streaming capabilities.
by sonnylazuardi
This project implements a Model Context Protocol (MCP) integration between Cursor AI and Figma, allowing Cursor to communicate with Figma for reading designs and modifying them programmatically.
by lharries
WhatsApp MCP Server is a Model Context Protocol (MCP) server for WhatsApp that allows users to search, read, and send WhatsApp messages (including media) through AI models like Claude. It connects directly to your personal WhatsApp account via the WhatsApp web multi-device API and stores messages locally in a SQLite database.
by idosal
GitMCP is a free, open-source remote Model Context Protocol (MCP) server that transforms any GitHub project into a documentation hub, enabling AI tools to access up-to-date documentation and code directly from the source to eliminate "code hallucinations."
by Klavis-AI
Klavis AI provides open-source Multi-platform Control Protocol (MCP) integrations and a hosted API for AI applications. It simplifies connecting AI to various third-party services by managing secure MCP servers and authentication.