by pab1it0
A Model Context Protocol (MCP) server that enables AI assistants to query and analyze Prometheus metrics through standardized interfaces.
Prometheus MCP Server is a Model Context Protocol (MCP) server designed to provide AI assistants with standardized access to Prometheus metrics and queries. It allows AI models to execute PromQL queries and analyze metric data directly through defined MCP interfaces.
Configuration requires the `PROMETHEUS_URL` environment variable and, optionally, authentication credentials (`PROMETHEUS_USERNAME`, `PROMETHEUS_PASSWORD`, or `PROMETHEUS_TOKEN`), set via a `.env` file or system environment variables. For multi-tenant setups, `ORG_ID` can also be configured.

The server exposes the following tools:

- `execute_query`: Execute a PromQL instant query.
- `execute_range_query`: Execute a PromQL range query with specified time intervals.
- `list_metrics`: List all available metrics.
- `get_metric_metadata`: Get metadata for a specific metric.
- `get_targets`: Get information about all scrape targets.

Q: What is MCP?
A: MCP stands for Model Context Protocol, a standardized interface that allows AI models to interact with various data sources and services.
Q: Can I choose which tools are available to the AI assistant?
A: Yes, the list of tools is configurable, allowing you to select which functionalities are exposed to the MCP client, optimizing for context window usage and specific needs.
Q: How do I handle authentication for Prometheus?
A: The server supports basic authentication (username/password) and bearer token authentication, configured via environment variables (`PROMETHEUS_USERNAME`, `PROMETHEUS_PASSWORD`, `PROMETHEUS_TOKEN`).
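As an illustration of how these variables typically select an auth scheme, here is a minimal sketch; `build_auth_headers` is a hypothetical helper for illustration, not the server's actual code:

```python
import base64

def build_auth_headers(env: dict) -> dict:
    """Pick an Authorization header from the environment, token first.

    Hypothetical helper for illustration; not the server's actual code.
    """
    token = env.get("PROMETHEUS_TOKEN")
    if token:
        return {"Authorization": f"Bearer {token}"}
    user = env.get("PROMETHEUS_USERNAME")
    password = env.get("PROMETHEUS_PASSWORD")
    if user and password:
        # Basic auth encodes "user:password" in base64
        creds = base64.b64encode(f"{user}:{password}".encode()).decode()
        return {"Authorization": f"Basic {creds}"}
    return {}  # no auth configured

print(build_auth_headers({"PROMETHEUS_TOKEN": "abc123"}))
# → {'Authorization': 'Bearer abc123'}
```

If both a token and basic credentials are set, the token wins in this sketch; check the server's own precedence rules before relying on that.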
Q: How can I contribute to the project?
A: Contributions are welcome! You can open an issue or submit a pull request. The project uses `uv` for dependency management and `pytest` for testing.
Execute PromQL queries against Prometheus
Discover and explore metrics
Authentication support
Docker containerization support
Provide interactive tools for AI assistants
The list of tools is configurable, so you can choose which tools you want to make available to the MCP client. This is useful if you don't use certain functionality or if you don't want to take up too much of the context window.
Ensure your Prometheus server is accessible from the environment where you'll run this MCP server.
Configure the environment variables for your Prometheus server, either through a `.env` file or system environment variables:
```env
# Required: Prometheus configuration
PROMETHEUS_URL=http://your-prometheus-server:9090

# Optional: Authentication credentials (if needed)
# Choose one of the following authentication methods if required:

# For basic auth
PROMETHEUS_USERNAME=your_username
PROMETHEUS_PASSWORD=your_password

# For bearer token auth
PROMETHEUS_TOKEN=your_token

# Optional: For multi-tenant setups like Cortex, Mimir or Thanos
ORG_ID=your_organization_id
```
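Multi-tenant backends such as Cortex, Mimir, and Thanos conventionally read the tenant from the `X-Scope-OrgID` HTTP header. A minimal sketch of how `ORG_ID` might map onto that header; `tenant_headers` is illustrative, not the server's actual code:

```python
def tenant_headers(env: dict) -> dict:
    """Map ORG_ID to the X-Scope-OrgID header used by Cortex/Mimir/Thanos.

    Illustrative helper, assuming the server forwards the tenant this way.
    """
    org_id = env.get("ORG_ID")
    return {"X-Scope-OrgID": org_id} if org_id else {}

print(tenant_headers({"ORG_ID": "team-a"}))
# → {'X-Scope-OrgID': 'team-a'}
```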
Add the server to your MCP client configuration, for example:

```json
{
  "mcpServers": {
    "prometheus": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "PROMETHEUS_URL",
        "ghcr.io/pab1it0/prometheus-mcp-server:latest"
      ],
      "env": {
        "PROMETHEUS_URL": "<url>"
      }
    }
  }
}
```
Contributions are welcome! Please open an issue or submit a pull request if you have any suggestions or improvements.
This project uses `uv` to manage dependencies. Install `uv` following the instructions for your platform:

```shell
curl -LsSf https://astral.sh/uv/install.sh | sh
```
You can then create a virtual environment and install the dependencies with:
```shell
uv venv
source .venv/bin/activate  # On Unix/macOS
.venv\Scripts\activate     # On Windows
uv pip install -e .
```
The project is organized with a `src` directory structure:
```
prometheus-mcp-server/
├── src/
│   └── prometheus_mcp_server/
│       ├── __init__.py      # Package initialization
│       ├── server.py        # MCP server implementation
│       └── main.py          # Main application logic
├── Dockerfile               # Docker configuration
├── docker-compose.yml       # Docker Compose configuration
├── .dockerignore            # Docker ignore file
├── pyproject.toml           # Project configuration
└── README.md                # This file
```
The project includes a comprehensive test suite that ensures functionality and helps prevent regressions.
Run the tests with pytest:
```shell
# Install development dependencies
uv pip install -e ".[dev]"

# Run the tests
pytest

# Run with coverage report
pytest --cov=src --cov-report=term-missing
```
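When adding tests, the usual pytest shape is plain functions whose names start with `test_`, using bare `assert` statements. A self-contained sketch; the helper and test names below are hypothetical, not part of the project's actual suite:

```python
# Illustrative pytest-style tests; names are hypothetical.

def validate_config(env: dict) -> str:
    """Return the configured Prometheus URL, or raise if it is missing."""
    url = env.get("PROMETHEUS_URL")
    if not url:
        raise ValueError("PROMETHEUS_URL is required")
    return url

def test_accepts_configured_url():
    assert validate_config({"PROMETHEUS_URL": "http://localhost:9090"}) == "http://localhost:9090"

def test_rejects_missing_url():
    try:
        validate_config({})
    except ValueError as exc:
        assert "PROMETHEUS_URL" in str(exc)
    else:
        raise AssertionError("expected ValueError")

test_accepts_configured_url()
test_rejects_missing_url()
```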
Tests are organized into:
When adding new features, please also add corresponding tests.
| Tool | Category | Description |
|---|---|---|
| `execute_query` | Query | Execute a PromQL instant query against Prometheus |
| `execute_range_query` | Query | Execute a PromQL range query with start time, end time, and step interval |
| `list_metrics` | Discovery | List all available metrics in Prometheus |
| `get_metric_metadata` | Discovery | Get metadata for a specific metric |
| `get_targets` | Discovery | Get information about all scrape targets |
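For orientation, `execute_range_query`'s arguments correspond to the parameters of Prometheus's `GET /api/v1/query_range` HTTP endpoint (`query`, `start`, `end`, `step`). A hedged sketch of that mapping; `range_query_params` is illustrative, not the server's implementation:

```python
def range_query_params(query: str, start: str, end: str, step: str) -> dict:
    """Build the query-string parameters for GET /api/v1/query_range."""
    return {"query": query, "start": start, "end": end, "step": step}

params = range_query_params(
    query='rate(http_requests_total{job="api"}[5m])',
    start="2024-01-01T00:00:00Z",   # RFC 3339 timestamps or Unix seconds
    end="2024-01-01T06:00:00Z",
    step="1m",                       # resolution of the returned series
)
print(params["step"])  # → 1m
```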
License: MIT