For teams building with CrewAI

Deploy CrewAI agents to your cloud without a DevOps team

Bring your existing CrewAI code and Docker Compose file. Defang turns it into a secure, production-ready deployment on AWS or GCP in a single command.

Docker Compose-native
Built for CrewAI stacks
BYOC on AWS or GCP
CLI

Deploy the stack from your repo in one command.

defang compose up --provider=aws
Defang launches your stack to any cloud
Diagram showing Defang deploying a multi-service CrewAI application into the cloud.
TLS

HTTPS, secrets, env wiring

Included by default when you deploy

Stack fit

Built for multi-agent CrewAI architectures

CrewAI applications rarely stop at a single process. You have web frontends, background workers, brokers, and databases, all of which need to be wired together across environments. Defang gives you a simple path from your docker-compose.yml to secure, repeatable cloud deployments.

Multi-agent crews + web workers
Keep your Python architecture
01

Describe your stack once

Express your Django/FastAPI app, CrewAI workers, Redis, and Postgres in a single Compose file. Defang uses that same description for local dev, staging, and production so you do not have to maintain parallel configurations.
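
A minimal sketch of what that Compose file could look like (service names, images, and credentials below are illustrative placeholders, not the official sample; real secrets would be set through Defang's config mechanism rather than committed in plain text):

services:
  web:
    build: ./web                 # Django or FastAPI app that serves the HTTP API / UI
    ports:
      - "8000:8000"              # Defang publishes this port behind HTTPS
    environment:
      - DATABASE_URL=postgres://crew:crew@db:5432/crew
      - REDIS_URL=redis://redis:6379/0
    depends_on:
      - db
      - redis

  worker:
    build: ./worker              # CrewAI crews/flows running as background workers
    environment:
      - DATABASE_URL=postgres://crew:crew@db:5432/crew
      - REDIS_URL=redis://redis:6379/0
    depends_on:
      - db
      - redis

  redis:
    image: redis:7

  db:
    image: postgres:16
    environment:
      - POSTGRES_USER=crew
      - POSTGRES_PASSWORD=crew   # placeholder only; use a managed secret in production
      - POSTGRES_DB=crew

The same file drives local development and the Defang deployment, which is what keeps dev, staging, and production from drifting apart.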

02

Use managed inference in your cloud

Because Defang deploys your CrewAI services inside your own AWS or GCP account, your agents can call managed inference such as Amazon Bedrock or Vertex AI using existing IAM, networking, and billing instead of routing requests through third party gateways.
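
As one hedged example, assuming your CrewAI code builds its LLM from an environment variable and resolves Bedrock model strings via LiteLLM (CrewAI's default model layer), and that the deployed service's IAM role is allowed to invoke Bedrock, the wiring can stay entirely in the Compose file; the variable names below are placeholders:

  worker:
    build: ./worker
    environment:
      - AWS_REGION=us-west-2     # a region where the chosen Bedrock model is enabled for your account
      - CREW_LLM_MODEL=bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0   # LiteLLM-style model id; substitute one you have access to

No API key or third-party gateway appears anywhere: requests are signed with the service's own AWS credentials, so access control and billing stay inside your account.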

03

Agent‑aware operations

Use Defang's Model Context Protocol (MCP) server to let AI coding agents in your IDE deploy and manage CrewAI services: check statuses, redeploy, or destroy test environments from a chat window.

Path to production

How it works

You keep building your Crews and Flows in Python. Defang uses Docker Compose and a small CLI to handle everything from image builds to cloud resources.

1

Containerise your CrewAI stack

Add Dockerfiles for your web app and workers, and a docker-compose.yml that defines how they run together with Redis and Postgres. If you prefer, start from the CrewAI + Django + Redis + Postgres sample and customise.
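
For reference, a minimal web-service Dockerfile might look like the sketch below (the module path and server choice are placeholders; the worker image is typically the same base with a different start command, such as a Celery or CrewAI entrypoint):

FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["gunicorn", "myproject.wsgi:application", "--bind", "0.0.0.0:8000"]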

2

Deploy with a single command

From your repo root, run:

defang compose up --provider=aws

Defang builds images, provisions cloud resources, configures TLS, and exposes your CrewAI app at a stable HTTPS URL.

3

Operate and iterate safely

Use the Defang CLI and MCP tools to inspect services, roll out new versions, or destroy test environments. Because environments are defined in Compose, spinning up staging or per‑customer stacks is straightforward.
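
A typical loop with the CLI looks roughly like the commands below; exact subcommands and flags can change between releases, so treat these as an approximation and confirm against defang --help:

defang compose up --provider=aws    # build and roll out the current Compose definition
defang tail                         # stream logs from the deployed services
defang compose down                 # tear down a test or staging environment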

Start from a working CrewAI + Defang template

The official sample pairs CrewAI with Django, Celery, Redis, and Postgres, and ships with a one‑click "Deploy with Defang" button. It's a practical reference for how to structure real‑time multi‑agent workloads on top of Defang.

Open the CrewAI sample on GitHub
Fork it, swap in your own Crews and Flows, and keep the same deployment pattern.

FAQ