by jokemanfire
Manage Containerd CRI interfaces through an MCP server written in Rust, providing full runtime and image service capabilities.
MCP Containerd implements an MCP server that speaks Containerd's CRI (Container Runtime Interface) APIs. It exposes runtime and image services, allowing AI‑driven tools or other clients to query and control containers and images via the Model Context Protocol.
Build the server with cargo build --release and start it with cargo run --release. By default it connects to unix:///run/containerd/containerd.sock; this endpoint is fixed for now but can be changed in future configuration files. Use the simple-chat-client example (see the linked repository) to send natural‑language requests that are translated into CRI calls via tools such as list_containers.

This is an MCP server implemented using the RMCP (Rust Model Context Protocol) library for operating Containerd's CRI interfaces.
cargo build --release
cargo run --release
By default, the service will connect to the unix:///run/containerd/containerd.sock endpoint.
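Once built, an MCP-capable client launches the binary and exchanges newline-delimited JSON-RPC messages with it. The sketch below shows what that handshake might look like, assuming the server uses the standard MCP stdio transport (which RMCP supports); the binary path, client name, and protocol version string are illustrative assumptions, not part of this project's documentation. It needs tokio (with the process and io features) and serde_json as dependencies.

```rust
use serde_json::json;
use std::process::Stdio;
use tokio::io::{AsyncBufReadExt, AsyncWriteExt, BufReader};
use tokio::process::Command;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Launch the server built by `cargo build --release`.
    // The binary name here is an assumption; adjust it to the actual artifact.
    let mut child = Command::new("./target/release/mcp-containerd")
        .stdin(Stdio::piped())
        .stdout(Stdio::piped())
        .spawn()?;

    let mut stdin = child.stdin.take().expect("stdin is piped");
    let stdout = child.stdout.take().expect("stdout is piped");
    let mut lines = BufReader::new(stdout).lines();

    // MCP initialize handshake (one JSON-RPC message per line).
    let init = json!({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": { "name": "example-client", "version": "0.1.0" }
        }
    });
    stdin.write_all(format!("{init}\n").as_bytes()).await?;
    if let Some(reply) = lines.next_line().await? {
        println!("initialize -> {reply}");
    }

    // Acknowledge the handshake, then ask which tools the server exposes.
    let initialized = json!({ "jsonrpc": "2.0", "method": "notifications/initialized" });
    stdin.write_all(format!("{initialized}\n").as_bytes()).await?;

    let list_tools = json!({ "jsonrpc": "2.0", "id": 2, "method": "tools/list" });
    stdin.write_all(format!("{list_tools}\n").as_bytes()).await?;
    if let Some(reply) = lines.next_line().await? {
        println!("tools/list -> {reply}");
    }

    child.kill().await?;
    Ok(())
}
```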
The simple-chat-client allows you to interact with the MCP Containerd service. Note: simple-chat-client has moved; follow the simple-chat-client link for its new location.
Example interaction:
> please give me a list of containers
AI: Listing containers...
Tool: list_containers
Result: {"containers":[...]}
> please give me a list of images
AI: Here are the images in your containerd:
Tool: list_images
Result: {"images":[...]}
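Under the hood, each "Tool:" step in the chat above corresponds to an MCP tools/call request sent to this server. A minimal sketch of that payload is shown below; the empty arguments object is an assumption (the tool may accept filters), and the result is whatever the server returns, as illustrated in the interaction above.

```rust
use serde_json::{json, Value};

/// Build the JSON-RPC message an MCP client sends to invoke `list_containers`.
/// Passing empty `arguments` is an assumption; consult the server's tool schema.
fn list_containers_request(id: u64) -> Value {
    json!({
        "jsonrpc": "2.0",
        "id": id,
        "method": "tools/call",
        "params": {
            "name": "list_containers",
            "arguments": {}
        }
    })
}
```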
The MCP server includes the following main components:
- version service: Provides CRI version information
- runtime service: Provides container and Pod runtime operations
- image service: Provides container image operations

The server currently uses the default configuration. Future versions will support customizing connection parameters through configuration files (a hypothetical sketch follows below).
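Configuration files are not supported yet, so the following is purely a hypothetical sketch of what customizing the connection endpoint might look like once they are; the file format (TOML), field names, and loading behaviour are all invented for illustration.

```rust
use serde::Deserialize;

/// Hypothetical future configuration; today the endpoint is fixed.
#[derive(Debug, Deserialize)]
struct Config {
    /// CRI endpoint the server connects to.
    #[serde(default = "default_endpoint")]
    endpoint: String,
}

fn default_endpoint() -> String {
    "unix:///run/containerd/containerd.sock".to_string()
}

/// Fall back to the built-in default when no config file is present,
/// mirroring the current always-default behaviour.
fn load_config(path: &str) -> Config {
    std::fs::read_to_string(path)
        .ok()
        .and_then(|text| toml::from_str(&text).ok())
        .unwrap_or_else(|| Config { endpoint: default_endpoint() })
}
```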
Apache-2.0
Discover more MCP servers with similar functionality and use cases
by daytonaio
Provides a secure, elastic sandbox environment for executing AI‑generated code with isolated runtimes and sub‑90 ms provisioning.
by awslabs
Specialized servers that expose AWS capabilities through the Model Context Protocol, allowing AI assistants and other applications to retrieve up‑to‑date AWS documentation, manage infrastructure, query services, and perform workflow automation directly from their context.
by awslabs
AWS MCP Servers allow AI agents to interact with and manage a wide range of AWS services using natural language commands. They enable AI-powered cloud management, automated DevOps, and data-driven insights within the AWS ecosystem.
by cloudflare
Remote Model Context Protocol endpoints that let AI clients read, process, and act on data across Cloudflare services such as Workers, Radar, Observability, and more.
by supabase-community
Enables AI assistants to interact directly with Supabase projects, allowing them to query databases, fetch configuration, manage tables, and perform other project‑level operations.
by Azure
azure-mcp is a server that implements the Model Context Protocol (MCP) to connect AI agents with Azure services. It allows developers to interact with Azure resources like Storage, Cosmos DB, and the Azure CLI using natural language commands within their development environment.
by Flux159
MCP Server for Kubernetes management commands, enabling interaction with Kubernetes clusters to manage pods, deployments, and services.
by strowk
Provides a Golang‑based server that enables interaction with Kubernetes clusters via prompts, allowing listing of contexts, namespaces, resources, nodes, pods, events, logs, and executing commands inside pods.
by jamsocket
Run arbitrary Python code securely in persistent, stateful sandboxes that remain available indefinitely.