by ragieai
Provides a Model Context Protocol (MCP) server that lets AI models retrieve relevant information from a Ragie knowledge base through a single `retrieve` tool. The server runs on Node.js, reads MCP messages from stdio, and returns matching content chunks.
RAGIE_API_KEY=your_api_key

RAGIE_API_KEY=your_api_key npx @ragieai/mcp-server

- `--description, -d <text>` – custom tool description.
- `--partition, -p <id>` – target a specific Ragie partition.
- Client configuration lives in a file (`mcp.json` or `claude_desktop_config.json`) that points to the `npx @ragieai/mcp-server` command and supplies the API key.
- Exposes a single `retrieve` tool with parameters `query`, `topK` (default 8), `rerank` (default true), and `recencyBias` (default false).
- Built with `@modelcontextprotocol/sdk`, `ragie`, and `zod` for validation.
- Runs via `npx` – no global install required.

Q: What Node.js version is required?
A: Node.js ≥ 18.
Q: Do I need to build the project before running?
A: No. The package can be executed directly via `npx`.

Q: How do I specify a different Ragie partition?
A: Use the `--partition` (or `-p`) CLI flag.

Q: Can I change the tool description shown to the model?
A: Yes, with the `--description` (or `-d`) flag.

Q: What environment variable must be set?
A: `RAGIE_API_KEY` – your Ragie authentication key.

Q: How do I integrate with Cursor?
A: Add an `mcp.json` file in the project (`.cursor/mcp.json`) or globally (`~/.cursor/mcp.json`) containing the server configuration.

Q: Is the server compatible with Claude Desktop?
A: Yes – place a `claude_desktop_config.json` in the appropriate application data folder with the same configuration.
A Model Context Protocol (MCP) server that provides access to Ragie's knowledge base retrieval capabilities.
This server implements the Model Context Protocol to enable AI models to retrieve information from a Ragie knowledge base. It provides a single tool called "retrieve" that allows querying the knowledge base for relevant information.
The server requires the following environment variable:
- `RAGIE_API_KEY` (required): Your Ragie API authentication key

The server will start and listen on stdio for MCP protocol messages.
Install and run the server with npx:
RAGIE_API_KEY=your_api_key npx @ragieai/mcp-server
The server supports the following command line options:
- `--description, -d <text>`: Override the default tool description with custom text
- `--partition, -p <id>`: Specify the Ragie partition ID to query

Examples:
# With custom description
RAGIE_API_KEY=your_api_key npx @ragieai/mcp-server --description "Search the company knowledge base for information"
# With partition specified
RAGIE_API_KEY=your_api_key npx @ragieai/mcp-server --partition your_partition_id
# Using both options
RAGIE_API_KEY=your_api_key npx @ragieai/mcp-server --description "Search the company knowledge base" --partition your_partition_id
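For context, both flags are plain string options. The sketch below shows how a wrapper script could parse them with Node's built-in `util.parseArgs`; the actual argument parsing inside `@ragieai/mcp-server` may differ, so only the flag names themselves come from the documentation above.

```typescript
// Sketch: parsing the documented flags with Node's built-in parser (Node >= 18.3).
// The real server may use a different argument library; treat this as illustrative.
import { parseArgs } from "node:util";

const { values } = parseArgs({
  options: {
    description: { type: "string", short: "d" }, // --description, -d <text>
    partition: { type: "string", short: "p" },   // --partition, -p <id>
  },
});

const toolDescription = values.description; // undefined -> server's built-in default description
const partitionId = values.partition;       // undefined -> default Ragie partition

console.log({ toolDescription, partitionId });
```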
To use this MCP server with Cursor:
Using an `mcp.json` file:

- Create a `.cursor/mcp.json` file in your project directory. This allows you to define MCP servers that are only available within that specific project.
- Or create a `~/.cursor/mcp.json` file in your home directory. This makes MCP servers available in all your Cursor workspaces.

Example `mcp.json`:
{
"mcpServers": {
"ragie": {
"command": "npx",
"args": [
"-y",
"@ragieai/mcp-server",
"--partition",
"optional_partition_id"
],
"env": {
"RAGIE_API_KEY": "your_api_key"
}
}
}
}
Alternatively, create an MCP server script file named `ragie-mcp.sh` on your system:

#!/usr/bin/env bash
export RAGIE_API_KEY="your_api_key"
npx -y @ragieai/mcp-server --partition optional_partition_id
Give the file execute permissions: chmod +x ragie-mcp.sh
Add the MCP server script by going to Settings -> Cursor Settings -> MCP Servers in the Cursor UI.
Replace `your_api_key` with your actual Ragie API key and optionally set the partition ID if needed.
To use this MCP server with Claude desktop:
Create or edit `claude_desktop_config.json` at the location for your platform:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%/Claude/claude_desktop_config.json`

Example `claude_desktop_config.json`:
{
"mcpServers": {
"ragie": {
"command": "npx",
"args": [
"-y",
"@ragieai/mcp-server",
"--partition",
"optional_partition_id"
],
"env": {
"RAGIE_API_KEY": "your_api_key"
}
}
}
}
Replace `your_api_key` with your actual Ragie API key and optionally set the partition ID if needed.
The Ragie retrieval tool will now be available in your Claude desktop conversations.
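Beyond Cursor and Claude Desktop, any MCP client can spawn the server over stdio. The snippet below is a rough sketch using the `@modelcontextprotocol/sdk` client package; the class and method names follow that SDK, while the query text and client name are placeholders, and the result is simply logged rather than interpreted.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the Ragie MCP server the same way Cursor or Claude Desktop would:
// via npx over stdio, with the API key supplied through the environment.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@ragieai/mcp-server"],
  env: {
    ...(process.env as Record<string, string>),
    RAGIE_API_KEY: process.env.RAGIE_API_KEY ?? "your_api_key",
  },
});

const client = new Client({ name: "ragie-example-client", version: "0.1.0" }, { capabilities: {} });
await client.connect(transport);

// Confirm the `retrieve` tool is exposed, then call it.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name)); // expected to include "retrieve"

const result = await client.callTool({
  name: "retrieve",
  arguments: { query: "How do I rotate my API key?", topK: 4 },
});
console.log(JSON.stringify(result.content, null, 2));

await client.close();
```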
The server provides a `retrieve` tool that can be used to search the knowledge base. It accepts the following parameters:

- `query` (string): The search query to find relevant information
- `topK` (number, optional, default: 8): The maximum number of results to return
- `rerank` (boolean, optional, default: true): Whether to try to find only the most relevant information
- `recencyBias` (boolean, optional, default: false): Whether to favor results toward more recent information

The tool returns matching content chunks from the knowledge base.
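To make the parameter list concrete, here is a sketch of how such a tool could be registered with the MCP TypeScript SDK, a zod schema carrying the documented defaults, and the Ragie SDK performing the lookup. It is illustrative only: the tool description text, the Ragie client call (`retrievals.retrieve`), and the response field (`scoredChunks`) are assumptions, not a copy of this server's source.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { Ragie } from "ragie";
import { z } from "zod";

// Assumed Ragie SDK constructor; authentication comes from RAGIE_API_KEY.
const ragie = new Ragie({ auth: process.env.RAGIE_API_KEY ?? "" });
const server = new McpServer({ name: "ragie", version: "0.1.0" });

// Register a `retrieve` tool whose schema mirrors the documented parameters and defaults.
server.tool(
  "retrieve",
  "Look up relevant information from the Ragie knowledge base.", // illustrative description
  {
    query: z.string().describe("The search query to find relevant information"),
    topK: z.number().default(8),
    rerank: z.boolean().default(true),
    recencyBias: z.boolean().default(false),
  },
  async ({ query, topK, rerank, recencyBias }) => {
    // Assumed Ragie SDK call and response shape; field names may differ in the real package.
    const response = await ragie.retrievals.retrieve({ query, topK, rerank, recencyBias });
    return {
      content: [{ type: "text", text: JSON.stringify(response.scoredChunks, null, 2) }],
    };
  }
);

// Serve MCP protocol messages over stdio, as described above.
await server.connect(new StdioServerTransport());
```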
This project is written in TypeScript and uses the following main dependencies:
- `@modelcontextprotocol/sdk`: For implementing the MCP server
- `ragie`: For interacting with the Ragie API
- `zod`: For runtime type validation

Running the server in dev mode:
RAGIE_API_KEY=your_api_key npm run dev -- --partition optional_partition_id
Building the project:
npm run build
MIT License - See LICENSE.txt for details.
{ "mcpServers": { "ragie": { "command": "npx", "args": [ "-y", "@ragieai/mcp-server" ], "env": { "RAGIE_API_KEY": "<YOUR_API_KEY>" } } } }
Discover more MCP servers with similar functionality and use cases
by topoteretes
Enables AI agents to store, retrieve, and reason over past conversations, documents, images, and audio transcriptions by loading data into graph and vector databases with minimal code.
by basicmachines-co
Basic Memory is a local-first knowledge management system that allows users to build a persistent semantic graph from conversations with AI assistants. It addresses the ephemeral nature of most LLM interactions by providing a structured, bi-directional knowledge base that both humans and LLMs can read and write to.
by smithery-ai
mcp-obsidian is a connector that allows Claude Desktop to read and search an Obsidian vault or any directory containing Markdown notes.
by qdrant
Provides a semantic memory layer on top of the Qdrant vector search engine, enabling storage and retrieval of information via the Model Context Protocol.
by GreatScottyMac
A database‑backed MCP server that stores project decisions, progress, architecture, custom data, and vector embeddings, allowing AI assistants in IDEs to retrieve precise, up‑to‑date context for generation tasks.
by StevenStavrakis
Enables AI assistants to read, create, edit, move, delete, and organize notes and tags within an Obsidian vault.
by mem0ai
Provides tools to store, retrieve, and semantically search coding preferences via an SSE endpoint for integration with MCP clients.
by graphlit
Enables integration between MCP clients and the Graphlit platform, providing ingestion, retrieval, RAG, and publishing capabilities across a wide range of data sources and tools.
by chroma-core
Provides vector, full‑text, and metadata‑based retrieval powered by Chroma for LLM applications, supporting in‑memory, persistent, HTTP, and cloud clients as well as multiple embedding functions.