by DefangLabs
A CLI and server for deploying applications from Docker Compose to the cloud.
Defang is a tool that simplifies the process of deploying applications from Docker Compose to various cloud platforms. It offers both a Command-Line Interface (CLI) and a Model Context Protocol (MCP) Server for seamless integration with IDEs. The goal is to enable developers to take their applications from a local Docker Compose setup to a secure and scalable cloud deployment quickly.
Getting started with Defang involves installing the CLI and then using commands to manage your deployments.
Installation: You can install the Defang CLI through several methods: the Homebrew DefangLabs/defang tap, a shell script, Go, the Nix package manager, winget, a PowerShell script, or the official Docker image. The exact commands for each method are listed in the installation section below.
Basic Usage:
defang generate - scaffold a new project with AI assistance
defang compose up - deploy the project in your current directory
Command Completion:
Defang supports command completion for Bash, Zsh, Fish, and PowerShell. Configure it by sourcing the output of defang completion [shell_name].
With Defang you can deploy docker-compose.yml files to cloud environments, use defang generate to leverage AI for development assistance, and run defang compose up in your project directory to deploy. The target cloud is selected with the DEFANG_PROVIDER environment variable, and other variables (such as DEFANG_ORG and DEFANG_BUILD_CONTEXT_LIMIT) let you tailor the deployment to your specific needs.
Take your app from Docker Compose to a secure and scalable deployment on your favorite cloud in minutes.
The Defang Command-Line Interface (CLI) is designed for developers who prefer to manage their workflows directly from the terminal. It offers full access to Defang’s capabilities, allowing you to build, test, and deploy applications efficiently to the cloud.
The Defang Model Context Protocol (MCP) Server is tailored for developers who work primarily within integrated development environments (IDEs). It enables seamless cloud deployment from supported editors such as Cursor, Windsurf, VS Code, VS Code Insiders, and Claude, delivering a fully integrated experience without leaving your development environment.
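This page does not show how the MCP server itself is started; recent Defang CLI releases expose it as an mcp subcommand (an assumption here, not stated above), for example:
defang mcp serve
Run defang --help to confirm which subcommands your installed version provides.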
Once the CLI is installed, the core workflow is two commands:
defang generate
defang compose up
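As a minimal sketch of the kind of compose file defang compose up deploys, assuming a single-service web app (the service name, image, and port below are illustrative, not taken from this page):
services:
  web:
    image: nginx:alpine
    ports:
      - "8080:80"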
Install the Defang CLI from one of the following sources:
Using the Homebrew package manager DefangLabs/defang tap:
brew install DefangLabs/defang/defang
Using a shell script:
eval "$(curl -fsSL s.defang.io/install)"
Using Go:
go install github.com/DefangLabs/defang/src/cmd/cli@latest
Using the Nix package manager:
nix-env -if https://github.com/DefangLabs/defang/archive/main.tar.gz
nix profile install github:DefangLabs/defang#defang-bin --refresh
Using winget:
winget install defang
Using a PowerShell script:
iwr https://s.defang.io/defang_win_amd64.zip -OutFile defang.zip
Expand-Archive defang.zip . -Force
Using the official image from Docker Hub:
docker run -it defangio/defang-cli help
or download the latest binary of the Defang CLI.
The Defang CLI supports command completion for Bash, Zsh, Fish, and PowerShell. To get the shell script for command completion, run the following command:
defang completion [bash|zsh|fish|powershell]
If you're using Bash, you can add the following to your ~/.bashrc file:
source <(defang completion bash)
If you're using Zsh, you can add the following to your ~/.zshrc file:
source <(defang completion zsh)
or pipe the output to a file called _defang in the directory with the completions.
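For example, assuming ~/.zsh/completions is one of the directories in your fpath (the path is illustrative):
defang completion zsh > ~/.zsh/completions/_defang
Start a new shell (or re-run compinit) to pick up the completion.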
If you're using Fish, you can add the following to your ~/.config/fish/config.fish file:
defang completion fish | source
If you're using PowerShell, you can add the following to your $HOME\Documents\PowerShell\Microsoft.PowerShell_profile.ps1 file:
Invoke-Expression -Command (defang completion powershell | Out-String)
The Defang CLI recognizes the following environment variables:
COMPOSE_PROJECT_NAME - The name of the project to use; overrides the name in the compose.yaml file
DEFANG_ACCESS_TOKEN - The access token to use for authentication; if not specified, uses the token from defang login
DEFANG_BUILD_CONTEXT_LIMIT - The maximum size of the build context when building container images; defaults to 100MiB
DEFANG_CD_BUCKET - The S3 bucket to use for the BYOC CD pipeline; defaults to defang-cd-bucket-…
DEFANG_CD_IMAGE - The image to use for the Continuous Deployment (CD) pipeline; defaults to public.ecr.aws/defang-io/cd:public-beta
DEFANG_DEBUG - Set this to 1 or true to enable debug logging
DEFANG_DISABLE_ANALYTICS - If set to true, disables sending analytics to Defang; defaults to false
DEFANG_EDITOR - The editor to launch after new project generation; defaults to code (VS Code)
DEFANG_FABRIC - The address of the Defang Fabric to use; defaults to fabric-prod1.defang.dev
DEFANG_JSON - If set to true, outputs JSON instead of human-readable output; defaults to false
DEFANG_HIDE_HINTS - If set to true, hides hints in the CLI output; defaults to false
DEFANG_HIDE_UPDATE - If set to true, hides the update notification; defaults to false
DEFANG_ISSUER - The OAuth2 issuer to use for authentication; defaults to https://auth.defang.io
DEFANG_MODEL_ID - The model ID of the LLM to use for the generate/debug AI integration (Pro users only)
DEFANG_NO_CACHE - If set to true, disables pull-through caching of container images; defaults to false
DEFANG_ORG - The name of the organization to use; defaults to the user's GitHub name
DEFANG_PREFIX - The prefix to use for all BYOC resources; defaults to Defang
DEFANG_PROVIDER - The name of the cloud provider to use: auto (default), aws, digitalocean, gcp, or defang
DEFANG_PULUMI_BACKEND - The Pulumi backend URL or "pulumi-cloud"; defaults to a self-hosted backend
DEFANG_PULUMI_DIR - Run Pulumi from this folder, instead of spawning a cloud task; requires --debug (BYOC only)
DEFANG_PULUMI_VERSION - Override the version of the Pulumi image to use (aws provider only)
NO_COLOR - If set to any value, disables color output; by default, color output is enabled depending on the terminal
PULUMI_ACCESS_TOKEN - The Pulumi access token to use for authentication to Pulumi Cloud; see DEFANG_PULUMI_BACKEND
PULUMI_CONFIG_PASSPHRASE - Passphrase used to generate a unique key for your stack, and configuration and encrypted state values
TZ - The timezone to use for log timestamps: an IANA TZ name like UTC or Europe/Amsterdam; defaults to Local
XDG_STATE_HOME - The directory to use for storing state; defaults to ~/.local/state
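For example, a BYOC deployment to AWS with debug logging and an explicit project name might combine several of these variables (the project name is illustrative):
export DEFANG_PROVIDER=aws
export DEFANG_DEBUG=1
export COMPOSE_PROJECT_NAME=myapp
defang compose up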
At Defang we use the Nix package manager for our dev environment, in conjunction with DirEnv.
To get started quickly, install Nix and DirEnv, then create a .envrc file to automatically load the Defang developer environment:
echo use flake >> .envrc
direnv allow
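If you prefer not to use DirEnv, and assuming the repository's flake defines a default dev shell (implied by the use flake directive above, but not stated explicitly), you can enter the environment manually from a clone of the repo:
nix develop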