by mem0ai
Provides tools to store, retrieve, and semantically search coding preferences via an SSE endpoint for integration with MCP clients.
mem0-mcp is a lightweight server that enables agents to manage coding preferences (code snippets, implementation details, version information, documentation, and best-practice guides) through the Model Context Protocol (MCP). The server persists these preferences using Mem0 and exposes them via a Server-Sent Events (SSE) endpoint.
Setup in brief: create a virtual environment with uv venv, activate it with source .venv/bin/activate, and install the package with uv pip install -e . (editable mode). Create a .env file containing MEM0_API_KEY=your_api_key_here, then start the server with uv run main.py (it listens on 0.0.0.0:8080 by default). MCP clients can connect to http://0.0.0.0:8080/sse and invoke the provided tools. The host and port are configurable via --host and --port, and the SSE endpoint (/sse) allows real-time interaction with MCP clients.

Q: Which language/runtime is required?
A: Python (managed via uv).
Q: How is authentication handled?
A: The server uses a Mem0 API key defined in the .env file.
Q: Can I change the listening port?
A: Yes, run uv run main.py --host <host> --port <port>.
Q: Is the server compatible with other MCP clients?
A: Any client that can connect to an SSE endpoint and follow the MCP specification can interact with the provided tools.
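Once the server is running, any MCP client can connect over SSE. The sketch below assumes the official `mcp` Python SDK (installed with `pip install mcp`); imports are kept inside the function so the snippet reads as a standalone illustration, and nothing connects until the coroutine is awaited:

```python
SERVER_URL = "http://0.0.0.0:8080/sse"

async def list_mem0_tools(url: str = SERVER_URL):
    """Connect to the mem0-mcp SSE endpoint and list its tools."""
    # Local imports so the sketch can be read without the SDK installed.
    from mcp import ClientSession
    from mcp.client.sse import sse_client

    async with sse_client(url) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            result = await session.list_tools()
            return [tool.name for tool in result.tools]
```

With the server up, run it with asyncio.run(list_mem0_tools()); it should return the three tool names described below.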
This demonstrates a structured approach for using an MCP server with mem0 to manage coding preferences efficiently. The server can be used with Cursor and provides essential tools for storing, retrieving, and searching coding preferences.
Create a virtual environment using uv:

uv venv
source .venv/bin/activate

Install the dependencies using uv:

# Install in editable mode from pyproject.toml
uv pip install -e .

Create a .env file in the root directory with your mem0 API key:

MEM0_API_KEY=your_api_key_here

Run the server:

uv run main.py
In Cursor, connect to the SSE endpoint (e.g., http://0.0.0.0:8080/sse) and switch to Agent mode.

Demo: https://github.com/user-attachments/assets/56670550-fb11-4850-9905-692d3496231c
The server provides three main tools for managing code preferences:
add_coding_preference: Store code snippets, implementation details, and coding patterns with comprehensive context.
get_all_coding_preferences: Retrieve all stored coding preferences to analyze patterns, review implementations, and ensure no relevant information is missed.
search_coding_preferences: Semantically search through stored coding preferences to find relevant snippets and patterns.
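The three tools above are invoked through a standard MCP session's call_tool method. The argument names below ("text", "query") are assumptions for illustration; check the schemas returned by session.list_tools() for the server's actual parameter names:

```python
def add_preference_args(snippet: str, context: str) -> dict:
    """Build an arguments payload pairing a code snippet with its context.

    The "text" key is a hypothetical parameter name, not confirmed
    by the server's tool schema.
    """
    return {"text": f"{context}\n\n{snippet}"}

async def demo(session):
    # Store a snippet together with a short description of what it does.
    await session.call_tool(
        "add_coding_preference",
        arguments=add_preference_args("def greet():\n    print('hi')",
                                      "Minimal greeting helper"),
    )
    # Retrieve everything stored so far.
    all_prefs = await session.call_tool("get_all_coding_preferences",
                                        arguments={})
    # Semantically search the stored preferences.
    hits = await session.call_tool("search_coding_preferences",
                                   arguments={"query": "greeting helper"})
    return all_prefs, hits
```

The demo coroutine expects an initialized ClientSession, such as one opened against the /sse endpoint.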
This implementation allows for a persistent coding preferences system that can be accessed via MCP. The SSE-based server can run as a process that agents connect to, use, and disconnect from whenever needed. This pattern fits well with "cloud-native" use cases where the server and clients can be decoupled processes on different nodes.
By default, the server runs on 0.0.0.0:8080 but is configurable with command line arguments like:
uv run main.py --host <your host> --port <your port>
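The flag handling above can be sketched with argparse. This mirrors the documented flags and defaults (0.0.0.0:8080) but is an illustration, not the project's actual main.py:

```python
import argparse

def parse_args(argv=None):
    """Parse --host/--port with the README's documented defaults."""
    parser = argparse.ArgumentParser(description="mem0-mcp server options")
    parser.add_argument("--host", default="0.0.0.0",
                        help="interface to bind (default: 0.0.0.0)")
    parser.add_argument("--port", type=int, default=8080,
                        help="TCP port to listen on (default: 8080)")
    return parser.parse_args(argv)
```

For example, parse_args(["--host", "127.0.0.1", "--port", "9000"]) yields a namespace with host "127.0.0.1" and port 9000, while parse_args([]) falls back to the defaults.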
The server exposes an SSE endpoint at /sse
that MCP clients can connect to for accessing the coding preferences management tools.
Discover more MCP servers with similar functionality and use cases
by topoteretes
Enables AI agents to store, retrieve, and reason over past conversations, documents, images, and audio transcriptions by loading data into graph and vector databases with minimal code.
by basicmachines-co
Basic Memory is a local-first knowledge management system that allows users to build a persistent semantic graph from conversations with AI assistants. It addresses the ephemeral nature of most LLM interactions by providing a structured, bi-directional knowledge base that both humans and LLMs can read and write to.
by smithery-ai
mcp-obsidian is a connector that allows Claude Desktop to read and search an Obsidian vault or any directory containing Markdown notes.
by qdrant
Provides a semantic memory layer on top of the Qdrant vector search engine, enabling storage and retrieval of information via the Model Context Protocol.
by GreatScottyMac
A database‑backed MCP server that stores project decisions, progress, architecture, custom data, and vector embeddings, allowing AI assistants in IDEs to retrieve precise, up‑to‑date context for generation tasks.
by StevenStavrakis
Enables AI assistants to read, create, edit, move, delete, and organize notes and tags within an Obsidian vault.
by graphlit
Enables integration between MCP clients and the Graphlit platform, providing ingestion, retrieval, RAG, and publishing capabilities across a wide range of data sources and tools.
by chroma-core
Provides vector, full‑text, and metadata‑based retrieval powered by Chroma for LLM applications, supporting in‑memory, persistent, HTTP, and cloud clients as well as multiple embedding functions.
by andrea9293
MCP Documentation Server is a TypeScript-based server that provides local document management and AI-powered semantic search capabilities, designed to bridge the AI knowledge gap.