by opensearch-project
A Model Context Protocol (MCP) server that lets AI assistants and large language models query, manage, and analyze OpenSearch clusters through a standardized protocol, exposing common search, mapping, health, and shard operations as callable tools over both stdio and streaming transports.
Install from PyPI:

pip install opensearch-mcp-server-py

This installs the server entry point (opensearch-mcp-server). Run it directly or wrap it in a service manager. Supply credentials through environment variables such as OPENSEARCH_USERNAME, OPENSEARCH_PASSWORD, or AWS_REGION. All core tools can be disabled at once with OPENSEARCH_DISABLED_CATEGORIES=core_tools.
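As a minimal sketch of that configuration (the OPENSEARCH_URL variable name and the credential values here are assumptions; substitute your cluster's details):

```shell
# Hypothetical basic-auth setup for a local cluster; OPENSEARCH_URL is an
# assumed variable name -- check the User Guide for your version.
export OPENSEARCH_URL="https://localhost:9200"
export OPENSEARCH_USERNAME="admin"
export OPENSEARCH_PASSWORD="admin"

# Start the server over stdio (entry point name as documented above):
# opensearch-mcp-server
```

For IAM-based authentication, set AWS_REGION and standard AWS credentials instead of the username/password pair.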
Q: How do I install the server?
A: Run pip install opensearch-mcp-server-py, then launch the provided entry point.
Q: Which transport should I choose?
A: For command-line integration use stdio; for web-based agents use SSE or streamable HTTP.

Q: How can I disable specific tools?
A: Set OPENSEARCH_DISABLED_CATEGORIES (e.g., core_tools) or use a custom tool filter as described in the User Guide.

Q: Do I need an AWS account?
A: Only if you want IAM-based authentication; otherwise basic auth with a username and password works.

Q: Can I add my own tools?
A: Yes. Follow the Developer Guide to implement additional MCP tools and register them under a new category.
opensearch-mcp-server-py is a Model Context Protocol (MCP) server for OpenSearch that enables AI assistants to interact with OpenSearch clusters. It provides a standardized interface for AI models to perform operations like searching indices, retrieving mappings, and managing shards through both stdio and streaming (SSE/Streamable HTTP) protocols.
Key features:
- Standardized MCP interface for AI models to operate on OpenSearch clusters
- Both stdio and streaming (SSE/Streamable HTTP) transports
- Basic authentication via username/password, or IAM-based authentication via AWS credentials
- Configurable tool filtering, including disabling whole tool categories
opensearch-mcp-server-py can be installed from PyPI via pip:
pip install opensearch-mcp-server-py
By default, only the core tools are enabled, providing essential OpenSearch functionality. Core tools are grouped under the core_tools category and can all be disabled at once by setting OPENSEARCH_DISABLED_CATEGORIES=core_tools. Avoid creating a custom category with this name, as it would override the built-in category.
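For example, disabling the whole built-in category before starting the server might look like this (a sketch; only the variable name and category name come from this page):

```shell
# Turn off every core tool in one step:
export OPENSEARCH_DISABLED_CATEGORIES=core_tools

# Restart the server so the filter takes effect:
# opensearch-mcp-server
```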
The following tools are available but disabled by default. To enable them, see the Tool Filter section in the User Guide.
ListIndexTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to
- index (optional): The name of the index to get detailed information for. If provided, returns detailed information about this specific index instead of listing all indices.

IndexMappingTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to
- index (required): The name of the index to retrieve mappings for

SearchIndexTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to
- index (required): The name of the index to search in
- query (required): The search query in OpenSearch Query DSL format

GetShardsTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to
- index (required): The name of the index to get shard information for

ClusterHealthTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to
- index (optional): Limit health reporting to a specific index

CountTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to
- index (optional): The name of the index to count documents in
- body (optional): Query in JSON format to filter documents

ExplainTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to
- index (required): The name of the index to retrieve the document from
- id (required): The document ID to explain
- body (required): Query in JSON format to explain against the document

MsearchTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to
- index (optional): Default index to search in
- body (required): Multi-search request body in NDJSON format

GetClusterStateTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to
- metric (optional): Limit the information returned to the specified metrics. Options include: _all, blocks, metadata, nodes, routing_table, routing_nodes, master_node, version
- index (optional): Limit the information returned to the specified indices

GetSegmentsTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to
- index (optional): Limit the information returned to the specified indices. If not provided, returns segments for all indices

CatNodesTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to
- metrics (optional): A comma-separated list of metrics to display. Available metrics include: id, name, ip, port, role, master, heap.percent, ram.percent, cpu, load_1m, load_5m, load_15m, disk.total, disk.used, disk.avail, disk.used_percent

GetNodesTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to
- node_id (optional): A comma-separated list of node IDs or names to limit the returned information. Supports node filters like _local, _master, master:true, data:false, etc. Defaults to _all.
- metric (optional): A comma-separated list of metric groups to include in the response. Options include: settings, os, process, jvm, thread_pool, transport, http, plugins, ingest, aggregations, indices. Defaults to all metrics.

GetIndexInfoTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to
- index (required): The name of the index to get detailed information for. Wildcards are supported.

GetIndexStatsTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to
- index (required): The name of the index to get statistics for. Wildcards are supported.
- metric (optional): Limit the information returned to the specified metrics. Options include: _all, completion, docs, fielddata, flush, get, indexing, merge, query_cache, refresh, request_cache, search, segments, store, warmer, bulk

GetQueryInsightsTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to

GetNodesHotThreadsTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to

GetAllocationTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to

GetLongRunningTasksTool
- opensearch_url (optional): The OpenSearch cluster URL to connect to
- limit (optional): The maximum number of tasks to return. Default is 10.

More tools coming soon.
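To make the body formats concrete, here is a sketch of the payloads two of the tools above accept; the index names, fields, and file names are hypothetical. SearchIndexTool's query argument takes standard Query DSL JSON, while MsearchTool's body argument takes NDJSON pairs of a header line and a query line:

```shell
# Hypothetical Query DSL body for SearchIndexTool's `query` argument:
cat <<'EOF' > search-query.json
{ "query": { "match": { "message": "error" } }, "size": 5 }
EOF

# Hypothetical NDJSON body for MsearchTool's `body` argument: each search is
# a header line (target index) followed by a query line.
cat <<'EOF' > msearch-body.ndjson
{ "index": "logs-2024" }
{ "query": { "match": { "level": "error" } } }
{ "index": "metrics-2024" }
{ "query": { "match_all": {} } }
EOF
```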
For detailed usage instructions, configuration options, and examples, please see the User Guide.
Interested in contributing? Check out our Developer Guide.
This project has adopted the Amazon Open Source Code of Conduct. For more information see the Code of Conduct FAQ, or contact opensource-codeofconduct@amazon.com with any additional questions or comments.
This project is licensed under the Apache v2.0 License.
Copyright 2020-2021 Amazon.com, Inc. or its affiliates. All Rights Reserved.