by rember
Create flashcards from chats, PDFs, and other notes using the Model Context Protocol for spaced‑repetition learning.
Rember MCP enables AI assistants (e.g., Claude) to generate flashcards on the fly by calling a simple MCP server. It forwards notes to the Rember API, which creates flashcards that are then stored in the user's Rember account.
Run the server with npx, providing your Rember API key, and configure the MCP client (such as Claude Desktop) to point to the server name rember. Once set up, users can ask the assistant to "help me remember this" or "create flashcards" and the tool will handle the rest.
The server exposes a single create_flashcards tool that accepts a list of notes and creates flashcards via the Rember API.

Q: How do I supply my API key?
A: Include it in the command arguments (--api-key=YOUR_REMBER_API_KEY) or set the API_KEY environment variable.
Q: What happens if I exceed my monthly flashcard limit?
A: The server will inform the user about the Rember Pro subscription and provide the relevant URL.
Q: Can I see how many flashcards were created?
A: The Rember API does not return that number; the tool reports the count of created "rembs" instead.
Q: Is telemetry enabled?
A: Not currently; future versions may add observability.
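The two behaviors above (reporting the remb count, and surfacing the Pro subscription when the limit is hit) can be sketched as a small helper that composes the tool-call response text. The wording is illustrative, and REMBER_PRO_URL is a placeholder, not the server's real strings:

```typescript
// Hedged sketch: compose the text returned from a tool call. Claude
// interprets this response; it is not shown verbatim to the user.
const REMBER_PRO_URL = "<rember-pro-url>"; // placeholder, not the real URL

function buildToolResponse(rembsCreated: number, limitReached: boolean): string {
  if (limitReached) {
    return (
      "Monthly flashcard limit reached. Tell the user they can upgrade to " +
      `Rember Pro at ${REMBER_PRO_URL} to keep creating flashcards.`
    );
  }
  return `Created ${rembsCreated} rembs. Tell the user their flashcards are ready in Rember.`;
}
```

Because the response is read by Claude rather than the user, it can carry instructions ("tell the user about…") instead of user-facing copy.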
Allow Claude to create flashcards for you with the official Model Context Protocol (MCP) for Rember. Rember helps you study and remember anything you care about by scheduling spaced repetition reviews.
Features and examples:
To run the Rember MCP server using npx, use the following command:
npx -y @getrember/mcp --api-key=YOUR_REMBER_API_KEY
Make sure to replace YOUR_REMBER_API_KEY with your actual Rember API key, which you can find in your Settings page. The API key should follow the format rember_ followed by 32 random characters.
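As a sanity check, the documented key shape can be validated before launching the server. This helper is illustrative, not part of the package:

```typescript
// Illustrative check that a key matches the documented format:
// "rember_" followed by exactly 32 characters.
function looksLikeRemberKey(key: string): boolean {
  return /^rember_.{32}$/.test(key);
}

console.log(looksLikeRemberKey("rember_" + "a".repeat(32))); // true
console.log(looksLikeRemberKey("rember_short"));             // false
```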
Add the following to your claude_desktop_config.json:
{
"mcpServers": {
"rember": {
"command": "npx",
"args": ["-y", "@getrember/mcp", "--api-key=YOUR_REMBER_API_KEY"]
}
}
}
create_flashcards: Create flashcards with AI. This tool takes a list of notes from Claude and calls the Rember API to generate a few flashcards for each note. After learning something new in your chat with Claude, you can ask "help me remember this", "create a few flashcards", or "add to Rember".

Here's a collection of lessons we learned while developing the Rember MCP server:
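A minimal sketch of how such a tool handler might forward notes and report the remb count, with the Rember client injected so the API call can be stubbed in tests. The RemberClient interface and its field names are assumptions, not the real Rember API:

```typescript
// Hedged sketch of a create_flashcards handler. The RemberClient interface
// and its field names are assumptions used for illustration.
interface RemberClient {
  createFlashcards(notes: { text: string }[]): Promise<{ rembsCreated: number }>;
}

async function handleCreateFlashcards(
  client: RemberClient,
  notes: { text: string }[],
): Promise<string> {
  const { rembsCreated } = await client.createFlashcards(notes);
  // The API does not report a flashcard count, so report created rembs instead.
  return `Created ${rembsCreated} rembs from ${notes.length} notes.`;
}
```

Injecting the client keeps the handler pure enough to unit-test with a stub instead of hitting the live API.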
Set up logging to stderr as early as possible; it's essential for debugging.
Create a simple MCP tool first and verify Claude can call it properly.
Invest time in iterating on the tool description.
Use the tool call response strategically; it's not shown directly to users but interpreted by Claude.
Implement retries for transient errors with suitable timeouts.
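The retry lesson can be sketched as a generic exponential-backoff wrapper. The attempt count and delays here are placeholders, not the values the server actually uses:

```typescript
// Hedged sketch: retry a transient-failure-prone async call with
// exponential backoff (200ms, 400ms, 800ms, ... by default).
async function withRetries<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 200,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Back off before the next attempt, but not after the final one.
      if (attempt < maxAttempts - 1) {
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
      }
    }
  }
  throw lastError;
}
```

In practice this would wrap the Rember API call, retrying only on errors classified as transient (timeouts, 5xx responses) rather than on every failure.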
We collected enough edge cases that testing manually on Claude Desktop (our main target MCP client) became cumbersome, so we built a suite of unit tests that simulate Claude Desktop behavior by calling the Claude API with the system prompt from claude.ai. In the current iteration, each test simulates a chat with Claude Desktop for manual inspection and includes a few simple assertions.
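One such assertion can be sketched as a check over the content blocks the Claude API returns: did the simulated chat actually invoke create_flashcards? The block shape follows the Anthropic Messages API (text and tool_use blocks); the surrounding test harness and the API call itself are omitted here:

```typescript
// Hedged sketch of one assertion used when simulating Claude Desktop:
// inspect the content blocks of a Messages API response and check whether
// the model emitted a tool_use block for our tool.
type ContentBlock = { type: string; name?: string };

function calledTool(content: ContentBlock[], tool: string): boolean {
  return content.some((b) => b.type === "tool_use" && b.name === tool);
}

// Example with a literal response shape:
const blocks: ContentBlock[] = [
  { type: "text" },
  { type: "tool_use", name: "create_flashcards" },
];
console.log(calledTool(blocks, "create_flashcards")); // true
```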
What's missing:
Alternatively, supply the key through the API_KEY environment variable instead of a command-line argument:

{
  "mcpServers": {
    "rember": {
      "command": "npx",
      "args": ["-y", "@getrember/mcp"],
      "env": {
        "API_KEY": "<YOUR_API_KEY>"
      }
    }
  }
}
Discover more MCP servers with similar functionality and use cases
by topoteretes
Enables AI agents to store, retrieve, and reason over past conversations, documents, images, and audio transcriptions by loading data into graph and vector databases with minimal code.
by basicmachines-co
Basic Memory is a local-first knowledge management system that allows users to build a persistent semantic graph from conversations with AI assistants. It addresses the ephemeral nature of most LLM interactions by providing a structured, bi-directional knowledge base that both humans and LLMs can read and write to.
by smithery-ai
mcp-obsidian is a connector that allows Claude Desktop to read and search an Obsidian vault or any directory containing Markdown notes.
by qdrant
Provides a semantic memory layer on top of the Qdrant vector search engine, enabling storage and retrieval of information via the Model Context Protocol.
by GreatScottyMac
A database‑backed MCP server that stores project decisions, progress, architecture, custom data, and vector embeddings, allowing AI assistants in IDEs to retrieve precise, up‑to‑date context for generation tasks.
by StevenStavrakis
Enables AI assistants to read, create, edit, move, delete, and organize notes and tags within an Obsidian vault.
by mem0ai
Provides tools to store, retrieve, and semantically search coding preferences via an SSE endpoint for integration with MCP clients.
by graphlit
Enables integration between MCP clients and the Graphlit platform, providing ingestion, retrieval, RAG, and publishing capabilities across a wide range of data sources and tools.
by chroma-core
Provides vector, full‑text, and metadata‑based retrieval powered by Chroma for LLM applications, supporting in‑memory, persistent, HTTP, and cloud clients as well as multiple embedding functions.