by pinecone-io
Enables AI assistants to interact with Pinecone projects, search official documentation, manage indexes, and perform upsert and search operations directly from the developer environment.
The Model Context Protocol (MCP) is a standard that allows coding assistants and other AI tools to interact with platforms like Pinecone. The Pinecone Developer MCP Server allows you to connect these tools with Pinecone projects and documentation.
Once connected, AI tools can:
- Search the official Pinecone documentation.
- Manage indexes and inspect their configuration and statistics.
- Upsert and search for records in your indexes.
See the docs for more detailed information.
This MCP server is focused on improving the experience of developers working with Pinecone as part of their technology stack. It is intended for use with coding assistants. Pinecone also offers the Assistant MCP, which is designed to provide AI assistants with relevant context sourced from your knowledge base.
To configure the MCP server to access your Pinecone project, you will need to generate an API key using the console. Without an API key, your AI tool will still be able to search documentation. However, it will not be able to manage or query your indexes.
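If you want to confirm that a newly generated key works before wiring it into a tool, one option is to list your indexes against the Pinecone control plane directly. This is only a rough sanity check, assuming the standard api.pinecone.io endpoint and Api-Key header; exact endpoint details may change, so treat it as a sketch rather than a reference.

# Optional sanity check (sketch): list indexes using your API key.
# Assumes the https://api.pinecone.io control plane and the Api-Key header.
export PINECONE_API_KEY=<your pinecone api key>
curl -s -H "Api-Key: $PINECONE_API_KEY" https://api.pinecone.io/indexes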
The MCP server requires Node.js. Ensure that node and npx are available in your PATH.
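A quick way to check this is to print the versions from a terminal; any recent Node.js release that ships npx should be sufficient.

node --version   # prints the installed Node.js version
npx --version    # npx is bundled with npm, which comes with Node.js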
Next, you will need to configure your AI assistant to use the MCP server.
To add the Pinecone MCP server to a project, create a .cursor/mcp.json file in the project root (if it doesn't already exist) and add the following configuration:
{
  "mcpServers": {
    "pinecone": {
      "command": "npx",
      "args": [
        "-y", "@pinecone-database/mcp"
      ],
      "env": {
        "PINECONE_API_KEY": "<your pinecone api key>"
      }
    }
  }
}
You can check the status of the server in Cursor Settings > MCP.
To enable the server globally, add the configuration to the .cursor/mcp.json file in your home directory instead.
It is recommended to use rules to instruct Cursor on proper usage of the MCP server. Check out the docs for some suggestions.
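As one illustration, a project rule along the following lines could nudge Cursor toward the server's tools. The .cursor/rules location, the file extension, and the rule wording are all assumptions here; they depend on your Cursor version, and the rule text itself is only an example, not an official recommendation.

# Illustrative only: create a project rule pointing Cursor at the Pinecone MCP tools.
# Adjust the path and format to match your Cursor version.
mkdir -p .cursor/rules
cat > .cursor/rules/pinecone-mcp.mdc <<'EOF'
When working with Pinecone:
- Use the search-docs tool to consult current Pinecone documentation before answering questions.
- Use list-indexes and describe-index before generating code that targets an existing index.
- Prefer create-index-for-model, upsert-records, and search-records for indexes with integrated inference.
EOF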
In Claude Desktop, locate the claude_desktop_config.json file by navigating to Settings > Developer > Edit Config, then add the following configuration:
{
  "mcpServers": {
    "pinecone": {
      "command": "npx",
      "args": [
        "-y", "@pinecone-database/mcp"
      ],
      "env": {
        "PINECONE_API_KEY": "<your pinecone api key>"
      }
    }
  }
}
Restart Claude Desktop. On the new chat screen, you should see a hammer (MCP) icon indicating that the new MCP tools are available.
To install this as a Gemini CLI extension, run the following command:
gemini extensions install https://github.com/pinecone-io/pinecone-mcp
You will need to provide your Pinecone API key in the PINECONE_API_KEY environment variable.
export PINECONE_API_KEY=<your pinecone api key>
When you run gemini and press ctrl+t, pinecone should now be shown in the list of installed MCP servers.
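Whichever client you use, you can also launch the server on its own with the same command the configurations above invoke, as a quick sanity check that the package resolves and your key is picked up. The server communicates over stdio, so expect it to wait silently for an MCP client rather than print a prompt.

# Run the MCP server directly (the same command the client configs use).
# It speaks MCP over stdio, so it will sit waiting for a client to connect.
export PINECONE_API_KEY=<your pinecone api key>
npx -y @pinecone-database/mcp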
Once configured, your AI tool will automatically make use of the MCP to interact with Pinecone. You may be prompted for permission before a tool can be used. Try asking your AI assistant to set up an example index, upload sample data, or search for you!
Pinecone Developer MCP Server provides the following tools for AI assistants to use:
- search-docs: Search the official Pinecone documentation.
- list-indexes: Lists all Pinecone indexes.
- describe-index: Describes the configuration of an index.
- describe-index-stats: Provides statistics about the data in the index, including the number of records and available namespaces.
- create-index-for-model: Creates a new index that uses an integrated inference model to embed text as vectors.
- upsert-records: Inserts or updates records in an index with integrated inference.
- search-records: Searches for records in an index based on a text query, using integrated inference for embedding. Has options for metadata filtering and reranking.
- cascading-search: Searches for records across multiple indexes, deduplicating and reranking the results.
- rerank-documents: Reranks a collection of records or text documents using a specialized reranking model.

Only indexes with integrated inference are supported. Assistants, indexes without integrated inference, standalone embeddings, and vector search are not supported.
We welcome your collaboration in improving the developer MCP experience. Please submit issues in the GitHub issue tracker. Information about contributing can be found in CONTRIBUTING.md.
{ "mcpServers": { "pinecone": { "command": "npx", "args": [ "-y", "@pinecone-database/mcp" ], "env": { "PINECONE_API_KEY": "<YOUR_API_KEY>" } } } }
Discover more MCP servers with similar functionality and use cases
by googleapis
Provides a configurable MCP server that abstracts connection pooling, authentication, observability, and tool management to accelerate development of database‑backed AI tools.
by bytebase
DBHub is a universal database gateway that implements the Model Context Protocol (MCP) server interface, enabling MCP-compatible clients to interact with various databases.
by neo4j-contrib
Provides Model Context Protocol servers for interacting with Neo4j databases, managing Aura instances, and handling personal knowledge graph memory through natural‑language interfaces.
by mongodb-js
Provides a Model Context Protocol server that connects to MongoDB databases and Atlas clusters, exposing a rich set of tools for querying, managing, and administering data and infrastructure.
by benborla
A Model Context Protocol (MCP) server that provides read-only access to MySQL databases, enabling Large Language Models (LLMs) to inspect database schemas and execute read-only queries.
by ClickHouse
Provides tools that let AI assistants run read‑only SQL queries against ClickHouse clusters or the embedded chDB engine, plus a health‑check endpoint for service monitoring.
by elastic
Provides direct, natural‑language access to Elasticsearch indices via the Model Context Protocol, allowing AI agents to query and explore data without writing DSL.
by motherduckdb
Provides an MCP server that enables SQL analytics on DuckDB and MotherDuck databases, allowing AI assistants and IDEs to execute queries via a unified interface.
by redis
Provides a natural language interface for agentic applications to manage and search data in Redis efficiently.