by pinecone-io
Retrieves information from Pinecone Assistant and returns multiple results via a configurable MCP server.
Pinecone Assistant MCP Server is an implementation of the Model Context Protocol (MCP) that connects to Pinecone Assistant, allowing clients to query the assistant and receive a configurable number of results.
Docker (recommended)
docker run -i --rm \
-e PINECONE_API_KEY=<YOUR_PINECONE_API_KEY_HERE> \
-e PINECONE_ASSISTANT_HOST=<YOUR_PINECONE_ASSISTANT_HOST_HERE> \
pinecone/assistant-mcp
PINECONE_API_KEY (required): your Pinecone API key.
PINECONE_ASSISTANT_HOST (optional): Pinecone Assistant API host (default: https://prod-1-data.ke.pinecone.io).
LOG_LEVEL (optional): logging level (default: info); adjust if needed.
Building from source
# Install Rust first
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
git clone https://github.com/pinecone-io/assistant-mcp.git
cd assistant-mcp
cargo build --release
./target/release/assistant-mcp
Provide the same environment variables as above.
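The server itself is written in Rust, but the variable precedence described above amounts to the following lookup logic. This is a Python sketch for illustration only; the function name is hypothetical, while the variable names and defaults mirror the list above.

```python
import os

def read_settings(env=os.environ):
    """Illustrative sketch of the env-var precedence described above."""
    if "PINECONE_API_KEY" not in env:
        # The API key has no default; the server cannot start without it.
        raise RuntimeError("PINECONE_API_KEY is required")
    return {
        "api_key": env["PINECONE_API_KEY"],
        # Optional variables fall back to the documented defaults.
        "host": env.get("PINECONE_ASSISTANT_HOST",
                        "https://prod-1-data.ke.pinecone.io"),
        "log_level": env.get("LOG_LEVEL", "info"),
    }
```

The same precedence applies whether the variables are passed with `docker run -e` or exported in the shell before running the binary.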
Integration with Claude Desktop
Add the following snippet to claude_desktop_config.json:
{
"mcpServers": {
"pinecone-assistant": {
"command": "docker",
"args": ["run", "-i", "--rm", "-e", "PINECONE_API_KEY", "-e", "PINECONE_ASSISTANT_HOST", "pinecone/assistant-mcp"],
"env": {
"PINECONE_API_KEY": "<YOUR_PINECONE_API_KEY_HERE>",
"PINECONE_ASSISTANT_HOST": "<YOUR_PINECONE_ASSISTANT_HOST_HERE>"
}
}
}
}
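If you already have other MCP servers configured, a small helper can merge this entry into the file without overwriting them. The sketch below is illustrative, not an official tool; the location of claude_desktop_config.json varies by OS, so the path is left as a parameter.

```python
import json
import os

# The same entry shown in the snippet above.
ENTRY = {
    "command": "docker",
    "args": ["run", "-i", "--rm", "-e", "PINECONE_API_KEY",
             "-e", "PINECONE_ASSISTANT_HOST", "pinecone/assistant-mcp"],
    "env": {
        "PINECONE_API_KEY": "<YOUR_PINECONE_API_KEY_HERE>",
        "PINECONE_ASSISTANT_HOST": "<YOUR_PINECONE_ASSISTANT_HOST_HERE>",
    },
}

def add_server(config_path):
    """Load the config (if any), add the entry, and write it back,
    preserving any servers already listed under mcpServers."""
    config = {}
    if os.path.exists(config_path):
        with open(config_path) as f:
            config = json.load(f)
    config.setdefault("mcpServers", {})["pinecone-assistant"] = ENTRY
    with open(config_path, "w") as f:
        json.dump(config, f, indent=2)
    return config
```

Running `add_server(...)` against your config path produces the same JSON as pasting the snippet by hand, but is idempotent and safe to rerun.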
If needed, add LOG_LEVEL to the env block to adjust verbosity.
Q: Do I need a Pinecone account?
A: Yes, you must have a Pinecone API key and an Assistant created in the Pinecone console.
Q: Can I run the server without Docker?
A: Absolutely. Build the Rust source with cargo build --release and run the resulting binary, supplying the same environment variables.
Q: How do I change the default assistant host?
A: Set the PINECONE_ASSISTANT_HOST environment variable to the desired endpoint when starting the container or binary.
Q: What logging levels are available?
A: Use LOG_LEVEL=debug, info (default), warn, or error to control verbosity.
Q: Is the number of results configurable per request?
A: The server respects the request payload's parameters; consult the MCP spec for the exact field name to adjust the result count.
An MCP server implementation for retrieving information from Pinecone Assistant.
To build the Docker image:
docker build -t pinecone/assistant-mcp .
Run the server with your Pinecone API key:
docker run -i --rm \
-e PINECONE_API_KEY=<YOUR_PINECONE_API_KEY_HERE> \
-e PINECONE_ASSISTANT_HOST=<YOUR_PINECONE_ASSISTANT_HOST_HERE> \
pinecone/assistant-mcp
PINECONE_API_KEY (required): Your Pinecone API key
PINECONE_ASSISTANT_HOST (optional): Pinecone Assistant API host (default: https://prod-1-data.ke.pinecone.io)
LOG_LEVEL (optional): Logging level (default: info)
Add this to your claude_desktop_config.json:
{
"mcpServers": {
"pinecone-assistant": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"PINECONE_API_KEY",
"-e",
"PINECONE_ASSISTANT_HOST",
"pinecone/assistant-mcp"
],
"env": {
"PINECONE_API_KEY": "<YOUR_PINECONE_API_KEY_HERE>",
"PINECONE_ASSISTANT_HOST": "<YOUR_PINECONE_ASSISTANT_HOST_HERE>"
}
}
}
}
If you prefer to build from source without Docker, build and then run the binary (with the same environment variables set):
cargo build --release
./target/release/assistant-mcp
export PINECONE_API_KEY=<YOUR_PINECONE_API_KEY_HERE>
export PINECONE_ASSISTANT_HOST=<YOUR_PINECONE_ASSISTANT_HOST_HERE>
# Run the inspector alone
npx @modelcontextprotocol/inspector cargo run
# Or run with Docker directly through the inspector
npx @modelcontextprotocol/inspector -- docker run -i --rm -e PINECONE_API_KEY -e PINECONE_ASSISTANT_HOST pinecone/assistant-mcp
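For a scripted smoke test without the Inspector, a client can drive the same stdio transport directly. The sketch below is purely illustrative: it assumes Docker, the pinecone/assistant-mcp image, and both environment variables are available, and it sends only the JSON-RPC initialize request defined by the MCP specification (protocolVersion shown for the 2024-11-05 revision).

```python
import json
import subprocess

# Same invocation the Inspector example above uses.
SERVER_CMD = [
    "docker", "run", "-i", "--rm",
    "-e", "PINECONE_API_KEY", "-e", "PINECONE_ASSISTANT_HOST",
    "pinecone/assistant-mcp",
]

def initialize_server(cmd=SERVER_CMD):
    """Send a JSON-RPC initialize request over stdio and return the
    parsed response. Requires Docker and the env vars to be set."""
    proc = subprocess.Popen(
        cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True
    )
    request = {
        "jsonrpc": "2.0", "id": 1, "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "smoke-test", "version": "0.0.1"},
        },
    }
    # MCP stdio transport: newline-delimited JSON messages.
    proc.stdin.write(json.dumps(request) + "\n")
    proc.stdin.flush()
    response = json.loads(proc.stdout.readline())
    proc.terminate()
    return response
```

A successful response carries the server's declared capabilities, which is usually enough to confirm the container and credentials are wired up correctly.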
This project is licensed under the terms specified in the LICENSE file.
Discover more MCP servers with similar functionality and use cases
by danny-avila
Provides a customizable ChatGPT‑like web UI that integrates dozens of AI models, agents, code execution, image generation, web search, speech capabilities, and secure multi‑user authentication, all open‑source and ready for self‑hosting.
by ahujasid
BlenderMCP integrates Blender with Claude AI via the Model Context Protocol (MCP), enabling AI-driven 3D scene creation, modeling, and manipulation. This project allows users to control Blender directly through natural language prompts, streamlining the 3D design workflow.
by pydantic
Enables building production‑grade generative AI applications using Pydantic validation, offering a FastAPI‑like developer experience.
by GLips
Figma-Context-MCP is a Model Context Protocol (MCP) server that provides Figma layout information to AI coding agents. It bridges design and development by enabling AI tools to directly access and interpret Figma design data for more accurate and efficient code generation.
by mcp-use
Easily create and interact with MCP servers using custom agents, supporting any LLM with tool calling and offering multi‑server, sandboxed, and streaming capabilities.
by sonnylazuardi
This project implements a Model Context Protocol (MCP) integration between Cursor AI and Figma, allowing Cursor to communicate with Figma for reading designs and modifying them programmatically.
by lharries
WhatsApp MCP Server is a Model Context Protocol (MCP) server for WhatsApp that allows users to search, read, and send WhatsApp messages (including media) through AI models like Claude. It connects directly to your personal WhatsApp account via the WhatsApp web multi-device API and stores messages locally in a SQLite database.
by idosal
GitMCP is a free, open-source remote Model Context Protocol (MCP) server that transforms any GitHub project into a documentation hub, enabling AI tools to access up-to-date documentation and code directly from the source to eliminate "code hallucinations."
by Klavis-AI
Klavis AI provides open-source Model Context Protocol (MCP) integrations and a hosted API for AI applications. It simplifies connecting AI to various third-party services by managing secure MCP servers and authentication.