by runjivu
Provides AWS CLI functionality through a standardized MCP interface, enabling portable use across various AI tools and environments.
The server replicates the use_aws tool from the Amazon Q Developer CLI, exposing AWS CLI commands via the Model Context Protocol (MCP). It allows AI-enabled applications to execute AWS operations safely and consistently without directly invoking the AWS CLI.
cargo install use_aws_mcp
../target/release/use_aws_mcp
The server communicates over stdin/stdout using JSON-RPC. Configure your MCP client with:
{
"mcpServers": {
"use_aws_mcp": {
"name": "use_aws_mcp",
"command": "use_aws_mcp",
"timeout": 300,
"env": {},
"disabled": false
}
}
}
Invoke the use_aws tool from the client by providing the required arguments (service, operation, region, etc.).
Q: Do I need the AWS CLI installed?
A: Yes, the AWS CLI must be present and configured with valid credentials.
Q: Can I use this with non‑shell MCP clients like Cursor?
A: Non-shell clients do not inherit environment variables automatically, so you should specify the profile_name in the request, as in the example below.
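For example, a request that pins the profile explicitly (the profile name here is only illustrative):
{
  "name": "use_aws",
  "arguments": {
    "service_name": "s3",
    "operation_name": "list-buckets",
    "region": "us-west-2",
    "profile_name": "development"
  }
}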
Q: How are large outputs handled?
A: Responses exceeding 100 KB are truncated to prevent memory issues.
Q: What determines a read‑only operation?
A: Operations whose names start with get, describe, list, ls, search, or batch_get are considered read-only.
Q: How do I enable debug logging?
A: Run the server with RUST_LOG=use_aws=debug ./target/release/use_aws_mcp.
🌟 amazon-q-cli is great, and it is great because it has the use_aws tool to interact with the AWS API.
💡 Wouldn't it be even better if use_aws were portable, so you could use it across whichever AI tool you're currently using?
⚡ use_aws_mcp is a standalone Model Context Protocol (MCP) server that provides AWS CLI functionality through a standardized interface.
This server replicates the functionality of the use_aws tool from the Amazon Q Developer CLI.
Usage with Avante, MCPHub in nvim
Usage with Cursor
curl https://sh.rustup.rs -sSf | sh
cargo build --release
The binary will be available at target/release/use_aws_mcp.
To use this server with an MCP client, first install it using Cargo:
cargo install use_aws_mcp
Then configure your MCP client with:
{
"mcpServers": {
"use_aws_mcp": {
"name": "use_aws_mcp",
"command": "use_aws_mcp",
"timeout": 300,
"env": {},
"disabled": false
}
}
}
With the Q CLI, MCP clients run as shell processes, so credential environment variables like AWS_DEFAULT_PROFILE are automatically passed through to the MCP server.
However, non-shell MCP clients like Cursor cannot take advantage of this, so it is best to tell the MCP client explicitly which AWS profile to use, as in the sketch below.
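One way to do this is to set the profile in the env block of the server entry. This is a minimal sketch assuming the server passes its environment through to the AWS CLI it invokes; AWS_PROFILE is the standard AWS CLI variable, and the profile name is only an example. Alternatively, pass profile_name explicitly in each tool call, as shown in the examples later in this document.
{
  "mcpServers": {
    "use_aws_mcp": {
      "name": "use_aws_mcp",
      "command": "use_aws_mcp",
      "timeout": 300,
      "env": {
        "AWS_PROFILE": "development"
      },
      "disabled": false
    }
  }
}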
📋 User Flow:
aws sso login
./target/release/use_aws_mcp
The server communicates via stdin/stdout using the JSON-RPC protocol; a raw exchange looks roughly like the sketch below.
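For illustration only, a minimal tools/call request and the general shape of its response over stdio might look like this (exact framing and response fields depend on the MCP revision the server implements):
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "use_aws",
    "arguments": {
      "service_name": "s3",
      "operation_name": "ls",
      "region": "us-west-2"
    }
  }
}
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [
      { "type": "text", "text": "..." }
    ]
  }
}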
The server provides human-readable descriptions of AWS CLI commands. You can see this in action by running the example:
cargo run --example description_demo
This will output something like:
Running aws cli command:
Service name: s3
Operation name: list-buckets
Parameters:
- max-items: "10"
- query: "Buckets[].Name"
Profile name: development
Region: us-west-2
Label: List S3 buckets with query
✅ This command is read-only (no acceptance required)
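The output above corresponds to a tool invocation along these lines (values taken from the demo output; the exact arguments are defined in examples/description_demo.rs):
{
  "name": "use_aws",
  "arguments": {
    "service_name": "s3",
    "operation_name": "list-buckets",
    "region": "us-west-2",
    "profile_name": "development",
    "label": "List S3 buckets with query",
    "parameters": {
      "max-items": "10",
      "query": "Buckets[].Name"
    }
  }
}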
The server provides a single tool called use_aws with the following schema:
{
"name": "use_aws",
"description": "Execute AWS CLI commands with proper parameter handling and safety checks",
"inputSchema": {
"type": "object",
"properties": {
"service_name": {
"type": "string",
"description": "AWS service name (e.g., s3, ec2, lambda)"
},
"operation_name": {
"type": "string",
"description": "AWS CLI operation name (e.g., list-buckets, describe-instances)"
},
"parameters": {
"type": "object",
"description": "Optional parameters for the AWS CLI command",
"additionalProperties": true
},
"region": {
"type": "string",
"description": "AWS region (e.g., us-west-2, eu-west-1)"
},
"profile_name": {
"type": "string",
"description": "Optional AWS profile name"
},
"label": {
"type": "string",
"description": "Optional label for the operation"
}
},
"required": ["service_name", "operation_name", "region"]
}
}
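The examples below show typical invocations of the use_aws tool, from a simple read-only listing to calls with parameters and an explicit profile.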
{
"name": "use_aws",
"arguments": {
"service_name": "s3",
"operation_name": "ls",
"region": "us-west-2"
}
}
{
"name": "use_aws",
"arguments": {
"service_name": "ec2",
"operation_name": "describe-instances",
"region": "us-west-2",
"parameters": {
"instance-ids": "i-1234567890abcdef0"
}
}
}
{
"name": "use_aws",
"arguments": {
"service_name": "lambda",
"operation_name": "list-functions",
"region": "us-west-2",
"profile_name": "development"
}
}
The server automatically detects read-only operations based on the operation name prefix: get, describe, list, ls, search, batch_get. Operations that do not match one of these prefixes are not treated as read-only (see the example below).
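For instance, a call like the following (instance ID reused from the earlier example) does not match any read-only prefix, so it would not be marked read-only and, as the demo output suggests, would require acceptance before running:
{
  "name": "use_aws",
  "arguments": {
    "service_name": "ec2",
    "operation_name": "stop-instances",
    "region": "us-west-2",
    "parameters": {
      "instance-ids": "i-1234567890abcdef0"
    }
  }
}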
Large outputs are automatically truncated to prevent memory issues, with a maximum response size of 100KB.
cargo test
cargo build
RUST_LOG=use_aws=debug cargo run
# Run the description demo
cargo run --example description_demo
The project is structured as follows:
src/lib.rs: Core library with types and constants
src/error.rs: Error handling types
src/use_aws.rs: Core AWS CLI functionality (replicated from the original)
src/mcp_server.rs: MCP server implementation
src/main.rs: Binary entry point
examples/description_demo.rs: Example demonstrating command descriptions
If you do not have Cargo (the Rust package manager) installed, you can get it by installing Rust using rustup:
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
Follow the on-screen instructions to complete the installation. After installation, restart your terminal and ensure Cargo is available by running:
cargo --version
You should see the installed Cargo version printed.
This project is distributed as a Rust crate. The following dependencies are managed automatically by Cargo:
tokio
serde
serde_json
eyre
bstr
convert_case
async-trait
thiserror
tracing
tracing-subscriber
crossterm
test/dev dependencies:
tokio-test
You do not need to install these manually; Cargo will handle them during installation.
MIT, Apache-2.0
This server executes AWS CLI commands, which may have security implications, so grant it only the credentials and permissions you intend it to use.
Run with debug logging to see detailed information:
RUST_LOG=use_aws=debug ./target/release/use_aws_mcp