by cbinsights
Provides an interface for developers to interact with CB Insights ChatCBI LLM through AI Agents.
The server exposes a set of HTTP endpoints that let developers send messages to the ChatCBI large language model and receive structured responses, including the conversation ID, related content, source citations, and suggested follow‑up prompts.
Setup:
- Install dependencies with uv sync (or the equivalent uv install).
- Copy the .env.example file to .env and set:
  - CBI_CLIENT_ID and CBI_CLIENT_SECRET for API authentication.
  - Optionally, CBI_MCP_PORT (default 8000) and CBI_MCP_TIMEOUT.
- Install the server with mcp install server.py, or run it manually:

uv --directory /path/to/cloned/cbi-mcp-server run server.py
Usage: call /chatcbi with a message and an optional chatID field. The response contains chatID, message, RelatedContent, Sources, and Suggestions. Reuse the returned chatID to keep context across multiple calls.

FAQ:
- The project is managed with the uv package manager.
- The server can be tested and debugged with the MCP inspector (mcp dev server.py).
- To change the port, set CBI_MCP_PORT in the .env file to any available port.
- What happens if no chatID is provided? The server creates a new ChatCBI session and returns its chatID in the response.
- The request timeout is controlled by CBI_MCP_TIMEOUT (seconds).

The CBI MCP Server provides an interface for developers to interact with the CB Insights ChatCBI LLM through AI Agents.
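Building the request body for a first call versus a follow-up call can be sketched as below. The field names message and chatID come from the tool description; the helper function name is illustrative, not part of the server's API.

```python
def build_chatcbi_payload(message, chat_id=None):
    """Sketch: assemble the body for a /chatcbi call.

    The field names ("message", "chatID") follow the documented
    inputs; this helper itself is illustrative only.
    """
    payload = {"message": message}
    if chat_id is not None:
        # Reusing a chatID keeps context across calls; omitting it
        # makes the server create a new ChatCBI session.
        payload["chatID"] = chat_id
    return payload

# First call: no chatID, so a new session is created server-side.
first = build_chatcbi_payload("What does CB Insights track?")

# Follow-up call: reuse the chatID returned by the first response.
follow_up = build_chatcbi_payload("Tell me more", chat_id="abc-123")
```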
Inputs:
- message: The message to send to ChatCBI.
- chatID: (optional) The unique id of an existing ChatCBI session, used for continuity in a conversation. If not provided, a new ChatCBI session will be created.

Outputs:
- chatID: Unique id of the current ChatCBI session.
- message: ChatCBI message generated in response to the message sent in the input.
- RelatedContent: Content that is related to the content returned.
- Sources: Supporting sources for the message content returned.
- Suggestions: Suggested prompts to further explore the subject matter.

The CBI MCP Server uses uv to manage the project.
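A minimal sketch of unpacking a response with these output fields (the sample dictionary and its values are illustrative; only the field names come from the documentation):

```python
# Illustrative response shape based on the documented output fields.
sample_response = {
    "chatID": "abc-123",
    "message": "CB Insights tracks private market data on companies.",
    "RelatedContent": ["Related research brief"],
    "Sources": [{"title": "CB Insights Research"}],
    "Suggestions": ["What industries are covered?"],
}

def summarize_response(resp):
    """Pull out what an agent typically needs: the reply text, the
    session id to reuse for continuity, and suggested follow-ups."""
    return {
        "reply": resp["message"],
        "chat_id": resp["chatID"],  # pass back on the next call
        "follow_ups": resp.get("Suggestions", []),
        "num_sources": len(resp.get("Sources", [])),
    }
```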
The default port is 8000, but it can be modified by updating the CBI_MCP_PORT environment variable in the .env file. The timeout for requests can also be modified via the CBI_MCP_TIMEOUT variable in the .env file.
Documentation on how CB Insights APIs are authenticated can be found here.
The server uses the CBI_CLIENT_ID
and CBI_CLIENT_SECRET
environment variables set in the .env
file to authorize requests.
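How the server might read these settings can be sketched as follows. The function name and the 30-second fallback timeout are assumptions for illustration; only the variable names and the port default of 8000 come from the documentation.

```python
import os

def load_config(env=None):
    """Sketch: read the documented environment variables.

    CBI_MCP_PORT defaults to 8000 per the docs; the 30-second
    timeout fallback is an assumption, not taken from the server.
    """
    env = os.environ if env is None else env
    return {
        "client_id": env.get("CBI_CLIENT_ID"),
        "client_secret": env.get("CBI_CLIENT_SECRET"),
        "port": int(env.get("CBI_MCP_PORT", "8000")),
        "timeout": float(env.get("CBI_MCP_TIMEOUT", "30")),
    }
```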
Update the claude_desktop_config.json
file using the following command:
mcp install server.py
This will add the following configuration:
{
"mcpServers": {
"cbi-mcp-server": {
"command": "/path/to/.local/bin/uv",
"args": [
"--directory",
"/path/to/cloned/cbi-mcp-server",
"run",
"server.py"
]
}
}
}
The inspector can be used to test/debug your server.
mcp dev server.py
{
  "mcpServers": {
    "cbi-mcp-server": {
      "command": "/path/to/.local/bin/uv",
      "args": [
        "--directory",
        "/path/to/cloned/cbi-mcp-server",
        "run",
        "server.py"
      ],
      "env": {
        "CBI_CLIENT_ID": "<YOUR_CLIENT_ID>",
        "CBI_CLIENT_SECRET": "<YOUR_CLIENT_SECRET>",
        "CBI_MCP_PORT": "8000",
        "CBI_MCP_TIMEOUT": "<TIMEOUT_SECONDS>"
      }
    }
  }
}