Provides a command‑line MCP server that proxies Quickchat AI agents to any Model Context Protocol‑compatible client, enabling seamless integration with tools like Claude Desktop, Cursor, VS Code, Windsurf and more.
The Quickchat AI MCP Server acts as a bridge between Quickchat AI agents and any AI application that supports the Model Context Protocol. By running a small Python‑based server, developers can expose their custom Quickchat agents as MCP services, allowing tools such as Claude Desktop, Cursor, VS Code extensions, and other MCP‑compatible clients to invoke the agent directly.
Install the uv package manager:
curl -LsSf https://astral.sh/uv/install.sh | sh
Configure your SCENARIO_ID and API_KEY, then uvx quickchat-ai-mcp starts the MCP service. Debugging uses uv run commands.
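For a quick local smoke test, the server can also be launched straight from a shell. This is only a sketch: the placeholder values are illustrative and come from the Quickchat AI app > MCP > Integration page, and in normal use your AI app launches the command for you via the JSON configuration shown further down.
# values come from Quickchat AI app > MCP > Integration
export SCENARIO_ID="< QUICKCHAT AI SCENARIO ID >"
export API_KEY="< QUICKCHAT AI API KEY >"
uvx quickchat-ai-mcp
The process then communicates over standard input/output, which is why MCP clients normally spawn it themselves.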
Q: Do I need to expose my Quickchat API key to users?
A: No. Turn the Require API key toggle off on the MCP page and share a configuration snippet that only contains SCENARIO_ID.
Q: Which package manager should I use?
A: The project recommends uv for installation and execution, but any Python environment capable of running the quickchat-ai-mcp package will work. For example, a plain virtual environment works as well, as sketched below.
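The following is a hedged sketch: it assumes the quickchat-ai-mcp package on PyPI installs a console script of the same name, which is what uvx invokes.
python -m venv .venv && source .venv/bin/activate
pip install quickchat-ai-mcp
# assumed console-script name; adjust if the package exposes a different entry point
SCENARIO_ID="< QUICKCHAT AI SCENARIO ID >" API_KEY="< QUICKCHAT AI API KEY >" quickchat-ai-mcp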
Q: Can I run the server in production?
A: Yes. Deploy the same command (uvx quickchat-ai-mcp) on a server or container; the Quickchat dashboard handles updates to the agent logic.
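As an illustration of the container route, one option is an image that already ships uv/uvx (Astral publishes such images; the exact tag below is an assumption, so check what is currently available). The -i flag keeps stdin open, which the MCP stdio transport needs.
docker run -i --rm \
  -e SCENARIO_ID="< QUICKCHAT AI SCENARIO ID >" \
  -e API_KEY="< QUICKCHAT AI API KEY >" \
  ghcr.io/astral-sh/uv:python3.12-bookworm-slim \
  uvx quickchat-ai-mcp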
Q: How do I debug issues?
A: Use the MCP inspector (uv run mcp dev src/__main__.py) or run the server with the debug configuration shown in the README.
Q: What clients are supported?
A: Any client that implements the Model Context Protocol, including Claude Desktop, Cursor, VS Code extensions, Windsurf, and future integrations listed on the MCP feature‑support matrix.
The Quickchat AI MCP (Model Context Protocol) server allows you to let anyone plug in your Quickchat AI Agent into their favourite AI app such as Claude Desktop, Cursor, VS Code, Windsurf and more.
Install uv using:
curl -LsSf https://astral.sh/uv/install.sh | sh
or read more here.
In Claude Desktop, go to Settings > Developer > Edit Config. Open the claude_desktop_config.json file in a text editor. If you're just starting out, the file is going to look like this:
{
"mcpServers": {}
}
This is where you can define all the MCPs your Claude Desktop has access to. Here is how you add your Quickchat AI MCP:
{
"mcpServers": {
"< QUICKCHAT AI MCP NAME >": {
"command": "uvx",
"args": ["quickchat-ai-mcp"],
"env": {
"SCENARIO_ID": "< QUICKCHAT AI SCENARIO ID >",
"API_KEY": "< QUICKCHAT AI API KEY >"
}
}
}
}
Go to the Quickchat AI app > MCP > Integration to find the above snippet with the values of MCP Name, SCENARIO_ID and API_KEY filled out.
In Cursor, go to Settings > Cursor Settings > MCP > Add new global MCP server and include the Quickchat AI MCP snippet:
{
"mcpServers": {
"< QUICKCHAT AI MCP NAME >": {
"command": "uvx",
"args": ["quickchat-ai-mcp"],
"env": {
"SCENARIO_ID": "< QUICKCHAT AI SCENARIO ID >",
"API_KEY": "< QUICKCHAT AI API KEY >"
}
}
}
}
As before, you can find the values for MCP Name, SCENARIO_ID and API_KEY at Quickchat AI app > MCP > Integration.
Other AI apps will most likely require the same configuration, but the actual steps to add it in the app itself will differ. We will be expanding this README as we go along.
⛔️ Do not publish your Quickchat API key to your users!
Once you're ready to let other users connect your Quickchat AI MCP to their AI apps, share the configuration snippet with them! However, you need to make sure they can use your Quickchat AI MCP without your Quickchat API key. Here is how to do that:
{
"mcpServers": {
"< QUICKCHAT AI MCP NAME >": {
"command": "uvx",
"args": ["quickchat-ai-mcp"],
"env": {
"SCENARIO_ID": "< QUICKCHAT AI SCENARIO ID >"
}
}
}
}
To debug during development, launch the server with the MCP inspector:
uv run mcp dev src/__main__.py
To run the development version of the server from source with your AI app, use the following JSON configuration:
{
"mcpServers": {
"< QUICKCHAT AI MCP NAME >": {
"command": "uv",
"args": [
"run",
"--with",
"mcp[cli]",
"--with",
"requests",
"mcp",
"run",
"< YOUR PATH>/quickchat-ai-mcp/src/__main__.py"
],
"env": {
"SCENARIO_ID": "< QUICKCHAT AI SCENARIO ID >",
"API_KEY": "< QUICKCHAT AI API KEY >"
}
}
}
}
Make sure your code is properly formatted and all tests are passing:
ruff check --fix
ruff format
uv run pytest
{ "mcpServers": { "quickchat-ai-mcp": { "command": "uvx", "args": [ "quickchat-ai-mcp" ], "env": { "SCENARIO_ID": "<YOUR_SCENARIO_ID>", "API_KEY": "<YOUR_API_KEY>" } } } }