by inkeep
Provides RAG-powered search over Inkeep product documentation via the Model Context Protocol, enabling clients to retrieve relevant content.
mcp-server-python offers a Model Context Protocol (MCP) server that connects to Inkeep's API to perform retrieval-augmented generation (RAG) on product documentation and other knowledge assets. It allows client applications, such as Claude Desktop, to query Inkeep content in a conversational manner.
The server uses uv for virtual environments and dependency management. Dependencies are installed with uv pip install -r pyproject.toml, and the server is registered with Claude Desktop via an entry in claude_desktop_config.json that points to the server's directory, command, and required environment variables.
Q: Do I need an Inkeep account?
A: Yes, an active Inkeep account and a project with API access are required.
Q: Which command should I use to run the server?
A: The recommended command is uv run -m inkeep_mcp_server within the project directory, as shown in the provided MCP client configuration.
Q: Can I use a different language model?
A: The server defaults to the inkeep-rag model, but you can change INKEEP_API_MODEL and other environment variables to target a different model if supported by Inkeep.
Q: How do I set the path to uv?
A: On macOS/Linux run which uv; on Windows run where uv, then use the full path in the command field if necessary.
Q: Is there a Docker image available?
A: The repository does not currently provide a Docker setup; you can containerize it manually by copying the repository and installing dependencies inside a container.
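Containerizing manually could look something like the following. This is a hypothetical, untested sketch, not something the repository provides; the base image, uv install method, and exposed environment variables are all assumptions:

```dockerfile
# Hypothetical sketch -- not provided by the repository.
FROM python:3.12-slim

# Install uv for virtual environments and dependency management.
RUN pip install --no-cache-dir uv

WORKDIR /app
COPY . /app

# Mirror the documented install steps.
RUN uv venv && uv pip install -r pyproject.toml

# Non-secret default; supply INKEEP_API_KEY at runtime, e.g. `docker run -e INKEEP_API_KEY=...`.
ENV INKEEP_API_BASE_URL=https://api.inkeep.com/v1

CMD ["uv", "run", "-m", "inkeep_mcp_server"]
```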
Inkeep MCP Server powered by your docs and product content.
git clone https://github.com/inkeep/mcp-server-python.git
cd mcp-server-python
uv venv
uv pip install -r pyproject.toml
Note the full path of the project, referred to as <YOUR_INKEEP_MCP_SERVER_ABSOLUTE_PATH> in a later step.
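While you are still inside the cloned directory, the absolute path can be captured from the shell (the variable name here is illustrative only):

```shell
# Capture the project's absolute path -- the value to substitute for
# the <YOUR_INKEEP_MCP_SERVER_ABSOLUTE_PATH> placeholder later on.
INKEEP_MCP_SERVER_ABSOLUTE_PATH="$(pwd)"
echo "$INKEEP_MCP_SERVER_ABSOLUTE_PATH"
```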
We'll refer to this API key as <YOUR_INKEEP_API_KEY> in later steps.
Follow the steps in this guide to set up Claude Desktop.
In your claude_desktop_config.json file, add the following entry to mcpServers.
{
  "mcpServers": {
    "inkeep-mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "<YOUR_INKEEP_MCP_SERVER_ABSOLUTE_PATH>",
        "run",
        "-m",
        "inkeep_mcp_server"
      ],
      "env": {
        "INKEEP_API_BASE_URL": "https://api.inkeep.com/v1",
        "INKEEP_API_KEY": "<YOUR_INKEEP_API_KEY>",
        "INKEEP_API_MODEL": "inkeep-rag",
        "INKEEP_MCP_TOOL_NAME": "search-product-content",
        "INKEEP_MCP_TOOL_DESCRIPTION": "Retrieves product documentation about Inkeep. The query should be framed as a conversational question about Inkeep."
      }
    }
  }
}
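JSON does not allow trailing commas, which is an easy mistake to make when adding an entry to mcpServers, so it may be worth validating the file after editing. One quick check uses Python's standard-library JSON parser, which exits nonzero on invalid input (here validating sample text from stdin; point it at your real claude_desktop_config.json instead):

```shell
# python3 -m json.tool parses JSON from stdin and fails on syntax errors
# such as trailing commas. Replace the sample text with your config file.
printf '%s' '{"mcpServers": {"inkeep-mcp-server": {"command": "uv"}}}' \
  | python3 -m json.tool > /dev/null && echo "config OK"
```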
You may need to put the full path to the uv executable in the command field. You can get this by running which uv on macOS/Linux or where uv on Windows.
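On macOS/Linux that lookup can be scripted as follows (UV_PATH is an illustrative variable name; paste its value into the "command" field if the bare "uv" does not resolve for Claude Desktop):

```shell
# `command -v` is the portable POSIX equivalent of `which`. If uv is
# not on PATH, the variable stays empty and the fallback message prints.
UV_PATH="$(command -v uv || true)"
echo "${UV_PATH:-uv not found on PATH}"
```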
Discover more MCP servers with similar functionality and use cases
by topoteretes
Enables AI agents to store, retrieve, and reason over past conversations, documents, images, and audio transcriptions by loading data into graph and vector databases with minimal code.
by basicmachines-co
Basic Memory is a local-first knowledge management system that allows users to build a persistent semantic graph from conversations with AI assistants. It addresses the ephemeral nature of most LLM interactions by providing a structured, bi-directional knowledge base that both humans and LLMs can read and write to.
by smithery-ai
mcp-obsidian is a connector that allows Claude Desktop to read and search an Obsidian vault or any directory containing Markdown notes.
by qdrant
Provides a semantic memory layer on top of the Qdrant vector search engine, enabling storage and retrieval of information via the Model Context Protocol.
by GreatScottyMac
A database‑backed MCP server that stores project decisions, progress, architecture, custom data, and vector embeddings, allowing AI assistants in IDEs to retrieve precise, up‑to‑date context for generation tasks.
by StevenStavrakis
Enables AI assistants to read, create, edit, move, delete, and organize notes and tags within an Obsidian vault.
by mem0ai
Provides tools to store, retrieve, and semantically search coding preferences via an SSE endpoint for integration with MCP clients.
by graphlit
Enables integration between MCP clients and the Graphlit platform, providing ingestion, retrieval, RAG, and publishing capabilities across a wide range of data sources and tools.
by chroma-core
Provides vector, full‑text, and metadata‑based retrieval powered by Chroma for LLM applications, supporting in‑memory, persistent, HTTP, and cloud clients as well as multiple embedding functions.