by sirmews
mcp-pinecone is a Model Context Protocol (MCP) server that reads from and writes to Pinecone, a vector database. It allows AI models such as Claude Desktop to perform operations like searching and uploading records to a Pinecone index, providing rudimentary Retrieval Augmented Generation (RAG) capabilities.
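As background, a rudimentary RAG step takes the chunks returned by a vector search and assembles them into context for the model's prompt. The sketch below is illustrative only, not code from mcp-pinecone; the match structure (score/text dictionaries) and the separator are assumptions:

```python
# Illustrative RAG context assembly (not mcp-pinecone code):
# take scored matches from a vector search and build a context block
# that a client like Claude Desktop could prepend to the user's question.

def build_context(matches, top_k=3):
    """Sort matches by score (descending) and join the top_k texts."""
    ranked = sorted(matches, key=lambda m: m["score"], reverse=True)
    return "\n---\n".join(m["text"] for m in ranked[:top_k])

matches = [
    {"score": 0.71, "text": "Pinecone stores vectors."},
    {"score": 0.93, "text": "MCP connects models to tools."},
    {"score": 0.55, "text": "uv is a Python package manager."},
]
context = build_context(matches, top_k=2)
```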
mcp-pinecone can be installed and configured for use with Claude Desktop. You can install it via Smithery using npx -y @smithery/cli install mcp-pinecone --client claude, or locally using uvx install mcp-pinecone or uv pip install mcp-pinecone. After installation, configure Claude Desktop by adding mcpServers entries to its configuration file (claude_desktop_config.json), specifying the command and arguments for running mcp-pinecone. You will also need a Pinecone account, an API key, and an index name.
semantic-search: Search for records in the Pinecone index.
read-document: Read a specific document from the Pinecone index.
list-documents: List all documents stored in the Pinecone index.
pinecone-stats: Get statistics about your Pinecone index (number of records, dimensions, namespaces).
process-document: Process documents by chunking, embedding (via Pinecone's Inference API), and upserting them into the Pinecone index.
Q: What is the Model Context Protocol (MCP)?
A: The Model Context Protocol is a standard that allows AI models to interact with external services and data sources, extending their capabilities beyond their pre-trained knowledge.
Q: How does mcp-pinecone handle embeddings?
A: mcp-pinecone generates embeddings using Pinecone's Inference API.
Q: What is uv and why is it recommended for installation?
A: uv is a fast Python package installer and resolver. It is recommended for its speed and efficiency in managing Python environments.
Q: How can I debug mcp-pinecone?
A: You can debug mcp-pinecone using the MCP Inspector, which can be launched via npx @modelcontextprotocol/inspector uv --directory {project_dir} run mcp-pinecone.
The server implements the ability to read and write to a Pinecone index.
semantic-search: Search for records in the Pinecone index.
read-document: Read a document from the Pinecone index.
list-documents: List all documents in the Pinecone index.
pinecone-stats: Get stats about the Pinecone index, including the number of records, dimensions, and namespaces.
process-document: Process a document into chunks and upsert them into the Pinecone index. This performs the overall steps of chunking, embedding, and upserting.
Note: embeddings are generated via Pinecone's Inference API, and chunking is done with a token-based chunker. Written by copying a lot from LangChain and debugging with Claude.
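The note above describes a token-based chunking step. As a rough illustration only (not the server's actual chunker, which is adapted from LangChain), a minimal token-window chunker with overlap might look like this; the whitespace tokenizer, chunk size, and overlap values are all assumptions chosen to keep the example self-contained:

```python
# Minimal sketch of token-based chunking with overlap.
# NOT mcp-pinecone's actual implementation: real tokenization would use
# a proper tokenizer, not naive whitespace splitting.

def chunk_tokens(text, chunk_size=8, overlap=2):
    """Split text into overlapping windows of `chunk_size` tokens."""
    tokens = text.split()  # stand-in for a real tokenizer
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(tokens), step):
        window = tokens[start:start + chunk_size]
        chunks.append(" ".join(window))
        if start + chunk_size >= len(tokens):
            break
    return chunks

# Each chunk would then be embedded (e.g. via Pinecone's Inference API)
# and upserted as a record with an id, vector, and source metadata.
doc = " ".join(f"tok{i}" for i in range(20))
chunks = chunk_tokens(doc)
```

The overlap keeps a few tokens shared between adjacent chunks so that sentences cut at a boundary still appear whole in at least one chunk.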
To install Pinecone MCP Server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install mcp-pinecone --client claude
We recommend using uv to install the server locally for Claude.
uvx install mcp-pinecone
OR
uv pip install mcp-pinecone
Add your config as described below.
On MacOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
Note: You might need to use the direct path to uv. Run which uv to find the path.
Development/Unpublished Servers Configuration
"mcpServers": {
"mcp-pinecone": {
"command": "uv",
"args": [
"--directory",
"{project_dir}",
"run",
"mcp-pinecone"
]
}
}
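If Claude Desktop cannot find uv on its PATH, the same entry can point at the absolute path reported by which uv. The path below is illustrative; yours will vary by machine:

```json
{
  "mcpServers": {
    "mcp-pinecone": {
      "command": "/usr/local/bin/uv",
      "args": ["--directory", "{project_dir}", "run", "mcp-pinecone"]
    }
  }
}
```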
Published Servers Configuration
"mcpServers": {
"mcp-pinecone": {
"command": "uvx",
"args": [
"--index-name",
"{your-index-name}",
"--api-key",
"{your-secret-api-key}",
"mcp-pinecone"
]
}
}
You can sign up for a Pinecone account on the Pinecone website.
Create a new index in Pinecone and get an API key from the Pinecone dashboard, then replace {your-index-name} and {your-secret-api-key} in the config accordingly.
To prepare the package for distribution:
uv sync
uv build
This will create source and wheel distributions in the dist/
directory.
uv publish
Note: You'll need to set PyPI credentials via environment variables or command flags:
--token or UV_PUBLISH_TOKEN
--username / UV_PUBLISH_USERNAME
--password / UV_PUBLISH_PASSWORD
Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.
You can launch the MCP Inspector via npm with this command:
npx @modelcontextprotocol/inspector uv --directory {project_dir} run mcp-pinecone
Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.
This project is licensed under the MIT License. See the LICENSE file for details.
The source code is available on GitHub.
Send your ideas and feedback to me on Bluesky or by opening an issue.