by cr7258
elasticsearch-mcp-server is a Model Context Protocol (MCP) server implementation that facilitates interaction with Elasticsearch and OpenSearch. It provides a set of tools for searching documents, analyzing indices, and managing clusters.
To use elasticsearch-mcp-server, you first need to start an Elasticsearch or OpenSearch cluster, typically using Docker Compose. After the cluster is running, you can configure the MCP server to connect to it. The server can be run using uvx (for automatic installation from PyPI) or uv (for local development). You can configure environment variables for the Elasticsearch/OpenSearch host, username, and password. The server supports Stdio, SSE (Server-Sent Events), and Streamable HTTP transports.
- general_api_request: Perform general HTTP API requests to Elasticsearch/OpenSearch.
- list_indices: List all indices.
- get_index: Retrieve information about one or more indices.
- create_index: Create a new index.
- delete_index: Delete an index.
- search_documents: Search for documents.
- index_document: Create or update a document.
- get_document: Get a document by ID.
- delete_document: Delete a document by ID.
- delete_by_query: Delete documents matching a specific query.
- get_cluster_health: Get basic information about cluster health.
- get_cluster_stats: Get a high-level overview of cluster statistics.
- list_aliases: List all aliases.
- get_alias: Get alias information for a specific index.
- put_alias: Create or update an alias.
- delete_alias: Delete an alias.
Q: What are the default credentials for Elasticsearch and OpenSearch when using Docker Compose?
A: For Elasticsearch, the default username is elastic and the password is test123. For OpenSearch, the default username is admin and the password is admin.
Q: How can I access Kibana/OpenSearch Dashboards?
A: You can access them from http://localhost:5601 after starting the cluster with Docker Compose.
Q: What are the different ways to run the elasticsearch-mcp-server?
A: You can run it using uvx (installs from PyPI) or uv (for local development). It supports Stdio, SSE, and Streamable HTTP transports.
A Model Context Protocol (MCP) server implementation that provides Elasticsearch and OpenSearch interaction. This server enables searching documents, analyzing indices, and managing clusters through a set of tools.
https://github.com/user-attachments/assets/f7409e31-fac4-4321-9c94-b0ff2ea7ff15
- general_api_request: Perform a general HTTP API request. Use this tool for any Elasticsearch/OpenSearch API that does not have a dedicated tool.
- list_indices: List all indices.
- get_index: Return information (mappings, settings, aliases) about one or more indices.
- create_index: Create a new index.
- delete_index: Delete an index.
- search_documents: Search for documents.
- index_document: Create or update a document in the index.
- get_document: Get a document by ID.
- delete_document: Delete a document by ID.
- delete_by_query: Delete documents matching the provided query.
- get_cluster_health: Return basic information about the health of the cluster.
- get_cluster_stats: Return a high-level overview of cluster statistics.
- list_aliases: List all aliases.
- get_alias: Get alias information for a specific index.
- put_alias: Create or update an alias for a specific index.
- delete_alias: Delete an alias for a specific index.
Copy the .env.example file to .env and update the values accordingly.
Start the Elasticsearch/OpenSearch cluster using Docker Compose:
# For Elasticsearch
docker-compose -f docker-compose-elasticsearch.yml up -d
# For OpenSearch
docker-compose -f docker-compose-opensearch.yml up -d
The default Elasticsearch username is elastic and the password is test123. The default OpenSearch username is admin and the password is admin.
You can access Kibana/OpenSearch Dashboards from http://localhost:5601.
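Before wiring up the MCP server, it can be useful to confirm the cluster is reachable with these defaults. The following is a minimal sketch using the Python requests library; it assumes the Elasticsearch compose setup serves HTTPS with a self-signed certificate, which is why TLS verification is disabled for this local check only.

# Quick local connectivity check against the Elasticsearch cluster started above.
# Assumes the default credentials (elastic / test123) and a self-signed certificate.
import requests

resp = requests.get(
    "https://localhost:9200",
    auth=("elastic", "test123"),
    verify=False,  # local self-signed certificate; never disable verification in production
)
resp.raise_for_status()
info = resp.json()
print(info["cluster_name"], info["version"]["number"])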
Using uvx will automatically install the package from PyPI, so there is no need to clone the repository locally. Add the following configuration to Claude Desktop's config file claude_desktop_config.json.
// For Elasticsearch
{
"mcpServers": {
"elasticsearch-mcp-server": {
"command": "uvx",
"args": [
"elasticsearch-mcp-server"
],
"env": {
"ELASTICSEARCH_HOSTS": "https://localhost:9200",
"ELASTICSEARCH_USERNAME": "elastic",
"ELASTICSEARCH_PASSWORD": "test123"
}
}
}
}
// For OpenSearch
{
"mcpServers": {
"opensearch-mcp-server": {
"command": "uvx",
"args": [
"opensearch-mcp-server"
],
"env": {
"OPENSEARCH_HOSTS": "https://localhost:9200",
"OPENSEARCH_USERNAME": "admin",
"OPENSEARCH_PASSWORD": "admin"
}
}
}
}
Using uv requires cloning the repository locally and specifying the path to the source code. Add the following configuration to Claude Desktop's config file claude_desktop_config.json.
// For Elasticsearch
{
"mcpServers": {
"elasticsearch-mcp-server": {
"command": "uv",
"args": [
"--directory",
"path/to/src/elasticsearch_mcp_server",
"run",
"elasticsearch-mcp-server"
],
"env": {
"ELASTICSEARCH_HOSTS": "https://localhost:9200",
"ELASTICSEARCH_USERNAME": "elastic",
"ELASTICSEARCH_PASSWORD": "test123"
}
}
}
}
// For OpenSearch
{
"mcpServers": {
"opensearch-mcp-server": {
"command": "uv",
"args": [
"--directory",
"path/to/src/elasticsearch_mcp_server",
"run",
"opensearch-mcp-server"
],
"env": {
"OPENSEARCH_HOSTS": "https://localhost:9200",
"OPENSEARCH_USERNAME": "admin",
"OPENSEARCH_PASSWORD": "admin"
}
}
}
}
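Outside of Claude Desktop, the same stdio setup can be driven programmatically. The sketch below is illustrative only and uses the official MCP Python SDK (assuming pip install mcp); it reuses the uvx command and environment variables from the configuration above and calls only the argument-free list_indices tool.

# Minimal sketch: spawn elasticsearch-mcp-server over stdio via the MCP Python SDK.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uvx",
    args=["elasticsearch-mcp-server"],
    env={
        "ELASTICSEARCH_HOSTS": "https://localhost:9200",
        "ELASTICSEARCH_USERNAME": "elastic",
        "ELASTICSEARCH_PASSWORD": "test123",
    },
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])  # e.g. list_indices, search_documents, ...
            result = await session.call_tool("list_indices", arguments={})
            print(result.content)

asyncio.run(main())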
# export environment variables
export ELASTICSEARCH_HOSTS="https://localhost:9200"
export ELASTICSEARCH_USERNAME="elastic"
export ELASTICSEARCH_PASSWORD="test123"
# By default, the SSE MCP server will serve on http://127.0.0.1:8000/sse
uvx elasticsearch-mcp-server --transport sse
# The host, port, and path can be specified using the --host, --port, and --path options
uvx elasticsearch-mcp-server --transport sse --host 0.0.0.0 --port 8000 --path /sse
# By default, the SSE MCP server will serve on http://127.0.0.1:8000/sse
uv run src/server.py elasticsearch-mcp-server --transport sse
# The host, port, and path can be specified using the --host, --port, and --path options
uv run src/server.py elasticsearch-mcp-server --transport sse --host 0.0.0.0 --port 8000 --path /sse
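Once the SSE server is running, an MCP client can connect over that transport instead of stdio. Below is a minimal sketch with the MCP Python SDK, assuming the default http://127.0.0.1:8000/sse endpoint shown above.

# Minimal sketch: connect to the SSE endpoint with the MCP Python SDK.
import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    async with sse_client("http://127.0.0.1:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            print([tool.name for tool in (await session.list_tools()).tools])

asyncio.run(main())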
# export environment variables
export ELASTICSEARCH_HOSTS="https://localhost:9200"
export ELASTICSEARCH_USERNAME="elastic"
export ELASTICSEARCH_PASSWORD="test123"
# By default, the Streamable HTTP MCP server will serve on http://127.0.0.1:8000/mcp
uvx elasticsearch-mcp-server --transport streamable-http
# The host, port, and path can be specified using the --host, --port, and --path options
uvx elasticsearch-mcp-server --transport streamable-http --host 0.0.0.0 --port 8000 --path /mcp
# By default, the Streamable HTTP MCP server will serve on http://127.0.0.1:8000/mcp
uv run src/server.py elasticsearch-mcp-server --transport streamable-http
# The host, port, and path can be specified using the --host, --port, and --path options
uv run src/server.py elasticsearch-mcp-server --transport streamable-http --host 0.0.0.0 --port 8000 --path /mcp
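Connecting over Streamable HTTP looks almost identical to the SSE sketch above; only the client helper and the default /mcp path change. This sketch assumes a recent version of the MCP Python SDK, in which the streamable HTTP helper also yields a session-ID callback that is not needed here.

# Minimal sketch: connect to the Streamable HTTP endpoint with the MCP Python SDK.
import asyncio
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    async with streamablehttp_client("http://127.0.0.1:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            print([tool.name for tool in (await session.list_tools()).tools])

asyncio.run(main())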
This project is licensed under the Apache License Version 2.0 - see the LICENSE file for details.