by unibaseio
Membase-MCP is a lightweight gateway server that connects AI agents to the Membase protocol for persistent, verifiable multi-session memory. Membase, powered by Unibase, is the first decentralized memory layer for AI agents: it provides secure, persistent storage for conversation history, interaction records, and knowledge, ensuring agent continuity, personalization, and traceability. Through the server, agents can upload and retrieve memory from the Unibase DA network for decentralized, verifiable storage.
To use Membase-MCP, you first need to clone the repository and run the server:
git clone https://github.com/unibaseio/membase-mcp.git
cd membase-mcp
uv run src/membase_mcp/server.py
You will also need to configure the environment variables MEMBASE_ACCOUNT, MEMBASE_CONVERSATION_ID, and MEMBASE_ID. The project provides configuration examples for platforms such as Claude, Windsurf, Cursor, and Cline, showing how to set up the mcpServers entry (see the example below).
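Note that the quickstart commands above will only work once these variables are present in the server's environment. As a minimal sketch of running the server directly from Python (the account values and checkout path are placeholders, not real credentials):

import os
import subprocess

# Placeholder values -- substitute your own account and IDs.
env = {
    **os.environ,
    "MEMBASE_ACCOUNT": "0xYourAccountAddress",   # your account, 0x...
    "MEMBASE_CONVERSATION_ID": "conv-demo-001",  # should be unique
    "MEMBASE_ID": "my-agent",                    # your sub account, any string
}

# Same launch command as the quickstart, run from the cloned repository.
subprocess.run(
    ["uv", "run", "src/membase_mcp/server.py"],
    cwd="path/to/membase-mcp",  # hypothetical checkout location
    env=env,
    check=True,
)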
Once set up, you can call functions within your LLM chat to interact with Membase: get_conversation_id and switch_conversation manage the active conversation, while save_message and get_messages store and retrieve messages. Membase-MCP is ideal for scenarios where AI agents require persistent, verifiable memory across sessions.
Q: Where can messages or memories be viewed?
A: Messages and memories can be viewed at https://testnet.hub.membase.io/.
Q: What environment variables are required?
A: MEMBASE_ACCOUNT, MEMBASE_CONVERSATION_ID, and MEMBASE_ID are required.
Example mcpServers configuration (for Claude, Windsurf, Cursor, Cline, and similar clients):
{
  "mcpServers": {
    "membase": {
      "command": "uv",
      "args": [
        "--directory",
        "path/to/membase-mcp",
        "run",
        "src/membase_mcp/server.py"
      ],
      "env": {
        "MEMBASE_ACCOUNT": "your account, 0x...",
        "MEMBASE_CONVERSATION_ID": "your conversation id, should be unique",
        "MEMBASE_ID": "your sub account, any string"
      }
    }
  }
}
Once the server is configured, call these functions directly from your LLM chat.
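For programmatic use outside an LLM chat, here is a minimal sketch using the MCP Python SDK. The launch parameters mirror the mcpServers config above, and the tool argument shapes are assumptions; check the list_tools() output for the real schemas.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch parameters mirror the mcpServers config above; the directory
# path and env values are placeholders.
params = StdioServerParameters(
    command="uv",
    args=["--directory", "path/to/membase-mcp", "run", "src/membase_mcp/server.py"],
    env={
        "MEMBASE_ACCOUNT": "0xYourAccountAddress",
        "MEMBASE_CONVERSATION_ID": "conv-demo-001",
        "MEMBASE_ID": "my-agent",
    },
)

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server actually exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # The argument shapes below are assumptions, not the
            # documented schemas.
            result = await session.call_tool("get_conversation_id", {})
            print(result.content)

            await session.call_tool("save_message", {"content": "hello, membase"})
            history = await session.call_tool("get_messages", {})
            print(history.content)

asyncio.run(main())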
Discover more MCP servers with similar functionality and use cases
by topoteretes
Enables AI agents to store, retrieve, and reason over past conversations, documents, images, and audio transcriptions by loading data into graph and vector databases with minimal code.
by basicmachines-co
Basic Memory is a local-first knowledge management system that allows users to build a persistent semantic graph from conversations with AI assistants. It addresses the ephemeral nature of most LLM interactions by providing a structured, bi-directional knowledge base that both humans and LLMs can read and write to.
by smithery-ai
mcp-obsidian is a connector that allows Claude Desktop to read and search an Obsidian vault or any directory containing Markdown notes.
by qdrant
Provides a semantic memory layer on top of the Qdrant vector search engine, enabling storage and retrieval of information via the Model Context Protocol.
by GreatScottyMac
A database‑backed MCP server that stores project decisions, progress, architecture, custom data, and vector embeddings, allowing AI assistants in IDEs to retrieve precise, up‑to‑date context for generation tasks.
by StevenStavrakis
Enables AI assistants to read, create, edit, move, delete, and organize notes and tags within an Obsidian vault.
by mem0ai
Provides tools to store, retrieve, and semantically search coding preferences via an SSE endpoint for integration with MCP clients.
by graphlit
Enables integration between MCP clients and the Graphlit platform, providing ingestion, retrieval, RAG, and publishing capabilities across a wide range of data sources and tools.
by chroma-core
Provides vector, full‑text, and metadata‑based retrieval powered by Chroma for LLM applications, supporting in‑memory, persistent, HTTP, and cloud clients as well as multiple embedding functions.