by smithery-ai
mcp-obsidian is a connector that allows Claude Desktop to read and search an Obsidian vault or any directory containing Markdown notes.
mcp-obsidian is a Model Context Protocol (MCP) connector designed to integrate Obsidian vaults or any directory containing Markdown notes with MCP clients like Claude Desktop. It enables these clients to access, read, and search through your Markdown notes, effectively extending their knowledge base.
mcp-obsidian can be installed and used in a few ways: run npx @smithery/cli install mcp-obsidian --client claude in your terminal and then restart Claude Desktop, or add a .vscode/mcp.json file within your workspace; this configuration specifies the command to run mcp-obsidian and prompts for the vault path.
Do I need npm installed? Yes, npm is required for installing mcp-obsidian, especially when using npx.
This is a connector to allow Claude Desktop (or any MCP client) to read and search any directory containing Markdown notes (such as an Obsidian vault).
Make sure Claude Desktop and npm are installed.
To install Obsidian Model Context Protocol for Claude Desktop automatically via Smithery:
npx @smithery/cli install mcp-obsidian --client claude
Then, restart Claude Desktop and you should see the mcp-obsidian MCP tools listed.
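If you prefer to configure Claude Desktop by hand rather than through Smithery, an entry along the following lines in claude_desktop_config.json should work. This is a minimal sketch assuming the standard mcpServers format; the vault path is a placeholder you would replace with your own:

{
  "mcpServers": {
    "mcp-obsidian": {
      "command": "npx",
      "args": ["-y", "mcp-obsidian", "/path/to/your/vault"]
    }
  }
}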
For manual installation, add the following JSON block to your User Settings (JSON) file in VS Code. You can do this by pressing Ctrl + Shift + P and typing Preferences: Open User Settings (JSON).
Optionally, you can add it to a file called .vscode/mcp.json
in your workspace. This will allow you to share the configuration with others.
Note that the mcp key is not needed in the .vscode/mcp.json file.
{
  "mcp": {
    "inputs": [
      {
        "type": "promptString",
        "id": "vaultPath",
        "description": "Path to Obsidian vault"
      }
    ],
    "servers": {
      "obsidian": {
        "command": "npx",
        "args": ["-y", "mcp-obsidian", "${input:vaultPath}"]
      }
    }
  }
}
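For reference, the equivalent .vscode/mcp.json in your workspace would look roughly like this, with the wrapping mcp key omitted as noted above:

{
  "inputs": [
    {
      "type": "promptString",
      "id": "vaultPath",
      "description": "Path to Obsidian vault"
    }
  ],
  "servers": {
    "obsidian": {
      "command": "npx",
      "args": ["-y", "mcp-obsidian", "${input:vaultPath}"]
    }
  }
}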
Discover more MCP servers with similar functionality and use cases
by topoteretes
Enables AI agents to store, retrieve, and reason over past conversations, documents, images, and audio transcriptions by loading data into graph and vector databases with minimal code.
by basicmachines-co
Basic Memory is a local-first knowledge management system that allows users to build a persistent semantic graph from conversations with AI assistants. It addresses the ephemeral nature of most LLM interactions by providing a structured, bi-directional knowledge base that both humans and LLMs can read and write to.
by qdrant
Provides a semantic memory layer on top of the Qdrant vector search engine, enabling storage and retrieval of information via the Model Context Protocol.
by GreatScottyMac
A database‑backed MCP server that stores project decisions, progress, architecture, custom data, and vector embeddings, allowing AI assistants in IDEs to retrieve precise, up‑to‑date context for generation tasks.
by StevenStavrakis
Enables AI assistants to read, create, edit, move, delete, and organize notes and tags within an Obsidian vault.
by mem0ai
Provides tools to store, retrieve, and semantically search coding preferences via an SSE endpoint for integration with MCP clients.
by graphlit
Enables integration between MCP clients and the Graphlit platform, providing ingestion, retrieval, RAG, and publishing capabilities across a wide range of data sources and tools.
by chroma-core
Provides vector, full‑text, and metadata‑based retrieval powered by Chroma for LLM applications, supporting in‑memory, persistent, HTTP, and cloud clients as well as multiple embedding functions.
by andrea9293
MCP Documentation Server is a TypeScript-based server that provides local document management and AI-powered semantic search capabilities, designed to bridge the AI knowledge gap.