1. Introduction
Google Cloud Platform (GCP) offers a rich array of services for running cloud applications, but harnessing them typically requires significant cloud expertise. Defang is a platform that bridges this gap by enabling developers to take an application described in a succinct Docker Compose file and deploy it to GCP with a single command. This whitepaper explores how Defang works with GCP, how it maps Docker Compose definitions to GCP resources, and how to integrate Defang into CI/CD workflows. We also delve into a case study of the Ask Defang chatbot deployed on GCP and discuss future enhancements on the Defang roadmap.
2. What is Defang?
Defang is a cloud deployment tool for developers that takes your Docker Compose application, written in any language or using any stack, and deploys it to a secure and scalable configuration on your favorite cloud, specifically including GCP. Defang abstracts away the complexities of provisioning and managing cloud services. Instead of manually configuring GCP services or writing extensive Terraform/Pulumi scripts, you define your application (services, databases, etc.) in a standard Compose YAML file and Defang maps them to corresponding services on the target cloud platform.
Defang supports configuring domain names, advanced networking, scalable compute (including provisioning GPUs), managed storage options such as databases and caches, managed LLMs, and even building your project from sources. Defang also supports multiple deployment modes, optimized for cost, availability, or a combination. Under the hood, Defang uses a rules-based workflow to help generate optimized deployments that conform to GCP best practices and are reliable, repeatable, and predictable. This empowers small teams and individual developers to achieve cloud deployments without a dedicated DevOps effort, and allows larger teams to accelerate delivery by automating deployment workflows.
3. Why Use Defang to Deploy to GCP?
There are several reasons why Defang is the ideal tool for deploying applications to GCP.
3.1. Ease of Use
Single-Command Deployment: Defang provides a frictionless developer experience. Once your application is described in a Docker compose.yml, you can deploy it to GCP with a single Defang CLI command (e.g. defang compose up --provider=gcp). Defang will automatically build your containers and provision the necessary cloud infrastructure. This dramatically simplifies the usual multi-step GCP deployment process.
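For illustration, a minimal Compose file is all Defang needs to get started; the service name, image, and port below are placeholders, not prescribed values:

```yaml
# compose.yml — hypothetical single-service app
services:
  app:
    image: myorg/myapp:latest   # any containerized app, any language
    ports:
      - target: 8080            # the port your app listens on
        mode: ingress           # expose publicly behind an HTTPS endpoint
```

Running defang compose up --provider=gcp against a file like this provisions the Cloud Run service, networking, and TLS endpoint described later in this paper, with no further configuration.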
Compose Compatibility: Because Defang uses Docker Compose as the application definition, developers can leverage existing Docker Compose files or familiarity. There is no new proprietary format to learn – if you know how to describe services in Compose (images, ports, volumes, etc.), you can use Defang. This is especially attractive to teams that have already containerized their apps or use Compose for local development.
IDEs & AI Integration: Defang is integrated into modern development workflows, including AI-assisted coding tools and popular IDEs. Using Defang’s Model Context Protocol (MCP) Server, developers can even trigger deployments via natural language prompts from environments like VS Code, Cursor, Windsurf, or Claude Desktop. This means a developer can literally tell their AI pair-programmer “deploy my project to GCP,” and Defang’s backend will translate that into a deterministic, reliable deployment operation. For fast-moving teams and “vibe coders” experimenting with AI-built apps, this removes even the context switch of running CLI commands – deployment becomes part of the development conversation.
3.2. Flexibility
Any Language, Any Stack: Defang is language-agnostic. Your Compose file can define services in Node.js, Python, Go, Java, or any runtime – as long as it can be containerized, Defang can deploy it. This flexibility means teams are not constrained in their technology choices by the deployment platform. Polyglot architectures (e.g. a Python backend, Node frontend, Redis cache) are first-class citizens.
Use of GCP Credits & Native Resources: Using Defang with GCP doesn’t mean you leave the GCP ecosystem – on the contrary, Defang deploys everything into your own GCP account. This means you can leverage GCP’s Free Tier and any credits you have. For example, you might use GCP’s $300 new customer credit or startup credits to cover the resources Defang provisions. All resources run under your ownership, allowing you to use GCP’s billing, monitoring, and identity and security controls as usual.
Multi-Cloud Portability: While our focus here is GCP, it’s worth noting Defang’s cloud-agnostic design gives teams cloud portability. Startups especially value the option to deploy on different providers without rewriting their infrastructure code. A team could prototype on the free Defang Playground, then deploy to AWS for production, and later decide to migrate to GCP in order to take advantage of GCP features such as Vertex AI – all using the same Compose file. This flexibility assures organizations (especially enterprises wary of vendor lock-in) that they can avoid being tied to a single cloud’s deployment scripts or syntax, thereby enabling more customers to try GCP.
3.3. Quality (Best Practices: Security, Performance, Cost)
Defang doesn’t just aim to simplify deployments – it strives to conform to best practices of cloud architecture for you. Defang’s architecture and implementation have been reviewed and approved by GCP Solution Architects, ensuring that every configuration deployed by Defang conforms to GCP’s well-architected framework covering security, scalability, and cost-efficiency.
Security: Defang sets up cloud resources following the principle of least privilege and modern security standards. For example, on GCP it will create dedicated service accounts for deploying and running your application, with only the permissions required (such as Cloud Run Invoker, Storage access for code, etc.). Network security is handled by using Cloud Run’s built-in isolation - each service runs in its own secure sandbox with HTTPS endpoints by default. When custom domains are used, Defang provisions Google-managed SSL certificates automatically, so your traffic is encrypted end-to-end without extra effort. In Defang’s balanced and high_availability deployment modes, Cloud Run services by default are not exposed publicly unless you choose to, and only services with ingress are exposed via a public load balancer. Additionally, Defang manages secrets and config through the Google Secret Manager. The net effect is that even small teams automatically deploy in a secure manner akin to what an experienced cloud architect would configure.
Performance & Scalability: Applications deployed via Defang on GCP benefit from the scalable architecture of Cloud Run. Defang will deploy each service as a Cloud Run service (a serverless container) which can auto-scale up based on request load and scale down to zero when idle. This means your app can handle bursts of traffic – Cloud Run will spin up more container instances as needed. And conversely, when traffic subsides, Cloud Run frees up resources, saving cost. Defang can also take advantage of new features like GPU support in Cloud Run, bringing serverless elasticity to machine learning and inferencing workloads.
Cost Efficiency: By automating resource selection and scaling, Defang helps enforce cost-effective practices. It chooses managed services with pay-per-use pricing (Cloud Run, Cloud SQL, etc.), so you generally pay only for what you use. Defang’s deployment modes also allow tuning for cost: e.g. in the affordable mode, Defang uses smaller instance sizes or preemptible/spot instances to save money. By contrast, high_availability mode uses on-demand, more resilient resources for reliability, thereby allowing you to choose the appropriate mode for development versus production deployments. Additionally, Defang is introducing tooling for cost estimation: you can run defang estimate to get a projection of what your deployment would cost before you deploy. This is especially useful for enterprises to budget cloud expenses and avoid accidentally provisioning something beyond their budget.
Overall, using Defang to deploy to GCP provides ease, flexibility, and quality: it simplifies deployment to a one-liner, supports essentially any tech stack on GCP, and applies Google’s cloud best practices for you.
4. How Defang Maps Your App to GCP Resources
Defang employs an optimized mapping of your Compose-defined application to GCP’s services and resources. In this section, we break down how Defang translates various aspects of your project into GCP resources, and how these may differ according to the deployment mode.
4.1. Building Your Project (Container Builds)
When you trigger a Defang deployment to GCP, one of the first steps is building your application’s container images. Defang handles this by running Cloud Build in your project. Defang packages your source code (excluding ignored files) and uploads it to a GCS bucket it creates (often named something like defang-cd-…). Then it uses GCP’s Cloud Build to compile the Docker images as specified by your Compose file.
Defang’s use of Cloud Build for building images ensures the build process is repeatable and not dependent on the environment on the developer’s machine. The build process is serverless and scaled appropriately according to the deployment mode - in affordable mode, it allocates fewer CPUs (to save cost), whereas in high_availability mode it will allocate more CPUs to increase speed. The built images are pushed to the GCP Artifact Registry in your account. By automating container builds and deployment, Defang saves you from having to run docker build, manage Docker registries, or write CI scripts for building images.
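As a sketch (the service name and paths are placeholders), it is a Compose service with a build section that triggers this Cloud Build flow, rather than a prebuilt image reference:

```yaml
services:
  api:
    build:
      context: ./api          # source uploaded to the GCS staging bucket
      dockerfile: Dockerfile  # image compiled remotely by Cloud Build
    ports:
      - target: 8080
        mode: ingress
```

On deployment, the resulting image is pushed to Artifact Registry in your project and referenced by the Cloud Run service Defang creates.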
4.2. Security: Accounts, Roles, Secrets, and Best Practices
Defang sets up a secure foundation on GCP so that your app runs with the correct identity, minimal privileges, and with secure management of sensitive values. These settings have gone through a review with GCP architects to ensure they conform to GCP security best practices.
- Cloud Account Isolation: All resources live in your GCP project, which you control access to. Defang does not keep persistent access to your cloud beyond what is needed to create resources. When you authenticate via gcloud auth and run Defang, it uses your Application Default Credentials to act on your behalf. Thereafter, those resources are under your control: Defang is not running your infrastructure, it is just provisioning it.
- Service Accounts and IAM Roles: Defang creates a Build service account which is used to build the images as described above. Defang also creates a CD service account which is used to provision the resources that are part of the app deployment. These service accounts are granted only the permissions needed. For instance, the Build service account has Cloud Run Builder permissions, while the CD service account has Cloud Run Admin permissions to deploy Cloud Run services. There is no Defang root access to your project – you could remove Defang’s rights after deployment and it would not impact your deployed app. This model is similar to using Terraform: you give it credentials to create resources, but the end state is just normal cloud resources.
- Secrets Management: Defang allows you to specify sensitive configuration values (such as API keys, passwords, etc.) either via environment variables or via Defang’s configuration system. On GCP, Defang securely manages all such sensitive config information in Google Secret Manager. These configuration values are then made available to the application as environment variables at run time.
- Secure Defaults: Defang aligns with many GCP secure-by-default features. For example, Cloud Run services are HTTPS by default with Google-managed certificates – all HTTP traffic is redirected to the corresponding HTTPS endpoints. Defang-generated service accounts follow GCP’s IAM security best practices, such as least privilege and avoiding creation of service account keys. Defang also sets up firewall rules to only allow access from services of the same project and from public load balancers. When Defang deploys services using Google Compute Engine, it uses the Google Container-Optimized OS, which optimizes efficiency and security.
- Auditability: Because Defang uses your GCP project, all actions it takes can be logged in GCP’s Cloud Audit Logs. You can see when service accounts were created, when Cloud Run services were deployed, etc. This is useful for compliance – you have a record in GCP of infrastructure changes, even though Defang initiated them. Enterprises can integrate this with their auditing pipelines.
- Updates and Patch Management: With Defang using managed services, a lot of security patching is offloaded to the cloud provider. For instance, Cloud Run’s underlying nodes, Cloud SQL Postgres, and Memorystore Redis are all patched by Google. You only maintain your application code and containers, which you can rebuild easily via Defang. If a vulnerability comes out, you can update your base image, run defang compose up again, and all services will be redeployed with the patched image (with zero downtime if in high_availability mode).
- Data Security: Managed storage (Cloud SQL, etc.) provides encryption at rest by default. In high_availability mode, Defang enables features like encryption and snapshots on deletion for databases to prevent data loss and comply with data retention requirements. Defang applies similar principles to managed Redis caches.
- Application Security: Defang doesn’t directly secure your code (you still must write safe code), but it encourages 12-factor practices. For example, it externalizes config (through env vars and managed services) so you don’t hardcode secrets. It supports staging environments so you can test before deploying to production. Defang also makes it trivial to spin up test environments for QA in an isolated cloud project.
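The Secrets Management flow above can be sketched as follows. The variable name is a placeholder, and the exact CLI subcommand is an assumption to check against Defang's documentation: the idea is that the Compose file declares the variable without a value, and Defang supplies it from Secret Manager at run time.

```yaml
services:
  api:
    image: myorg/api:latest
    environment:
      - STRIPE_API_KEY   # no value in the file or in source control;
                         # supplied via Defang's config system (e.g.
                         # `defang config set STRIPE_API_KEY`, assumed CLI form)
                         # and backed by Google Secret Manager on GCP
```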
4.3. Domain Names
Defang allows you to bring your own domain for your application, so that your services can be accessed at friendly URLs (e.g. api.mycompany.com). On GCP, Defang automates this: you specify the domainname in your Compose file, and Defang provides instructions on where to point the domain’s DNS record to authorize SSL certificate issuance. Once the DNS is set up correctly, the Defang CLI will confirm the setup, and GCP Certificate Manager will issue the SSL certificate using load balancer authorization.
```yaml
services:
  web:
    image: myapp/web:latest
    domainname: nextjs.defang.chewydemos.com
    ports:
      - target: 3000
        mode: ingress
```

If you deploy to GCP without a custom domain, your services will get the Defang delegate domain (e.g. xxx.defang.app) and, in the affordable deployment mode, also the default Cloud Run domain (e.g. https://<service>-<random>-uc.a.run.app).
4.4. Networking
Networking in a cloud environment can be complex, involving VPCs, subnets, gateways, firewalls etc. Defang shields you from most of this complexity. As part of its Compose support, Defang implements the semantics according to the Docker Compose Networks specification. In the GCP case, Defang deploys Cloud Run services to a VPC with one or more subnets behind a public Cloud Load Balancer.
If your Compose file has multiple services that need to talk to each other, Defang will create a Private DNS Zone in Cloud DNS attached to the VPC mentioned above to ensure they can discover each other. Each service is added as servicename.google.internal, and other services can look up "servicename" to get the corresponding internal IP address. Defang creates internal load balancers to facilitate inter-service communication. Defang also sets up Google Private Service Connect to allow application services to access managed services such as databases and caches.
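To illustrate the service-discovery behavior described above (service names, ports, and the environment variable are placeholders), one service can reach another simply by its Compose service name:

```yaml
services:
  frontend:
    build: ./frontend
    environment:
      - API_URL=http://api:5000   # "api" resolves via the private DNS zone
    ports:
      - target: 3000
        mode: ingress             # public entry point
  api:
    build: ./api
    ports:
      - target: 5000              # reachable by other services at api:5000
```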
4.5. Compute (Cloud Run, Auto-Scaling, GPUs)
Generally, each service in your Compose file becomes a Cloud Run service under the hood. Cloud Run is a fully managed container execution environment that abstracts away servers. Defang chooses Cloud Run because it fits the “serverless, scalable by default” model. Defang creates a service account per service that is part of the app. These service accounts have permissions to access other GCP services that are part of the app. Defang ensures that Cloud Run will run your container image with the settings derived from the Compose file. For example, the Compose ports you specify inform how Defang maps traffic: in Compose you might expose port 5000, and Defang will ensure Cloud Run knows to listen on 5000 (Cloud Run actually always listens on the $PORT env var internally, but Defang sets that up).
Cloud Run automatically provides certain benefits: HTTPS endpoints, auto-scaling, scale-to-zero, concurrency control (how many requests per instance), and revisions for rollbacks. Defang leverages these. If you set x-defang-autoscaling: true for a service, Defang will enable horizontal scaling for that Cloud Run service based on CPU usage. If you do not want autoscaling, you can pin a number of replicas and Defang would then deploy Cloud Run with a min and max equal to that count.
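A sketch of both scaling options described above; the resource values and service names are illustrative, not recommendations:

```yaml
services:
  web:
    image: myorg/web:latest
    x-defang-autoscaling: true   # enable horizontal scaling on CPU usage
    deploy:
      resources:
        reservations:
          cpus: "1.0"
          memory: 512M
  worker:
    image: myorg/worker:latest
    deploy:
      replicas: 2                # pinned count: min and max instances both 2
```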
In a couple of cases - when either the service uses a host mode port (network_mode: "host") or the service has more than 1 port - Defang uses Compute Engine instead. It creates Compute instances according to the resource requirements and schedules containers on them.
GPUs: With GPU support now generally available on Cloud Run, Defang can deploy GPU workloads seamlessly. If your service needs a GPU (for example, to run a PyTorch model or an LLM inference server), you would indicate that in your Compose file per the specification. Defang then ensures Cloud Run will schedule that container on a node with an appropriate GPU. Note: You might be limited to specific regions where GPU support is available as per GCP’s documentation. Defang will choose the region you have set via the GCP_LOCATION environment variable, so you should pick a region that supports GPUs if you intend to use them.
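Following the Compose specification's device-reservation syntax, a GPU request might look like the sketch below (the service and GPU count are placeholders; remember to set GCP_LOCATION to a GPU-enabled region):

```yaml
services:
  inference:
    build: ./inference           # e.g. a PyTorch model server
    deploy:
      resources:
        reservations:
          devices:
            - capabilities: ["gpu"]
              count: 1
```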
4.6. Managed Storage (Postgres and Redis)
Many applications need stateful services like databases and caches. Defang offers Managed Storage extensions so that instead of running, say, a Postgres database in a container, you can have a cloud-managed database provisioned automatically. The way this works in Compose is by using extension fields on a service that indicate it should be managed, not containerized.
Managed Postgres: If your Compose file has a service using the official Postgres image, you can add x-defang-postgres: true to that service’s definition. When Defang sees that and you deploy to GCP, it will provision a Cloud SQL for PostgreSQL instance. Defang takes care of setting up the Cloud SQL instance in your project, including managing the password and connection info. It will then replace the service in your app with connection environment variables so that your other services (e.g. backend API) can connect to the Cloud SQL instance.
Managed Redis: Similarly, Defang supports x-defang-redis: true on a Redis service. Instead of running a Redis container, Defang will provision a GCP Memorystore for Redis instance (a fully managed Redis service) in your project and provide your app the connection endpoint.
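A sketch combining both extensions (the image tags and service names are illustrative):

```yaml
services:
  db:
    image: postgres:16   # replaced by a Cloud SQL for PostgreSQL instance on deploy
    x-defang-postgres: true
  cache:
    image: redis:7       # replaced by a Memorystore for Redis instance on deploy
    x-defang-redis: true
```

Locally (e.g. with docker compose up), these run as ordinary containers; on GCP, Defang swaps them for the managed equivalents and wires up the connection details.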
Connection Details: When Defang provisions a managed database, it injects the necessary connection info into your services. Typically, you’d get environment variables such as POSTGRES_DB, POSTGRES_USER, POSTGRES_PASSWORD, for Postgres, or a Redis URL. The actual values are stored securely using Defang’s configuration handling.
The advantage of using x-defang-postgres or x-defang-redis is huge for production deployments: you get a database with proper durability and scaling. For example, Cloud SQL Postgres offers automatic storage increase, point-in-time recovery, etc. By letting Defang manage Postgres, you ensure your database uses GCP’s fully managed service with high quality-of-service. Defang even handles version upgrades in a safe way.
4.7. Managed LLMs – Vertex AI Integration
One of Defang’s unique features is built-in support for managed Large Language Models (LLMs). As AI models and services become an integral part of modern apps, Defang provides an easy path to integrate cloud-native AI services such as Google’s Vertex AI on GCP. Defang offers an extension, x-defang-llm, that you can add to any service in your Compose file that uses Vertex AI’s SDKs or APIs. When you mark a service with x-defang-llm, Defang knows that the service is intended to use a managed LLM and configures the environment appropriately. In the context of GCP, this entails a few things:
- Ensuring the service’s GCP credentials or IAM permissions allow it to invoke Vertex AI. This includes assigning the Vertex AI User role to the service’s account.
- Setting up supporting infrastructure: although Vertex AI typically doesn’t require provisioning (it’s a managed API), Defang’s LLM support introduces a sidecar that translates OpenAI-compatible API calls to Vertex AI, mapping any LLM model IDs in the process.
This is powerful for AI application developers: suppose you have a chat application that uses OpenAI’s API locally. With Defang, you could instead use Vertex AI’s PaLM API in production on GCP – you’d mark your service with x-defang-llm, provide the model name, and Defang ensures your deployed service can reach Vertex AI and is configured to use it.
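A hedged sketch of such a service; the service layout and the environment variable name are assumptions for illustration, not Defang-prescribed names:

```yaml
services:
  chat:
    build: ./chat
    x-defang-llm: true   # route the service's OpenAI-compatible API calls
                         # through the Vertex AI translation sidecar
    environment:
      - LLM_MODEL        # model ID your code passes to the API (placeholder)
```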
4.8. Logs / Observability
Observability is crucial in any deployment. Defang provides multiple ways to view and analyze your logs and app status:
- Defang CLI (Tailing Logs): Right after you deploy, the Defang CLI automatically attaches to your service logs and streams them in your terminal. You will see the build logs (output of Cloud Build steps) and then the runtime logs from your application containers. This immediate feedback is great for development – you know whether your app started successfully on Cloud Run and can see if it’s handling requests.
- GCP Cloud Logging: When your app runs on GCP (Cloud Run), all stdout/stderr from containers go to Google Cloud Logging by default. That means you can always go to the GCP Console, open Cloud Run’s Logs view or the Logs Explorer, and see everything. As Defang uses your GCP account and project, you have full access to these logs. This is crucial for enterprise teams: all logs remain in your environment, and you can integrate them with Cloud Logging sinks, BigQuery, SIEM systems, etc.
- defang tail: The Defang CLI offers a defang tail command to fetch logs from your GCP deployments on demand. This command essentially acts like gcloud logs tail under the hood. It connects to GCP’s logging and continuously streams logs to your console. You can even specify a <service> argument to filter by service name.
In addition, Defang includes an AI Debugger feature. When you deploy a project (using defang compose up), the Defang CLI will wait for all services’ statuses to switch to healthy. If any service fails to deploy (because of a build failure, failing health checks, or a variety of other issues), the AI debugger will kick in and ask for permission to analyze the logs and files in your project to identify the issue. It will then suggest fixes in the terminal. While not a replacement for traditional monitoring, it can speed up debugging in development, and even for production issues if used appropriately.
4.9. Deployment Modes
Defang currently supports three deployment modes: affordable, balanced, and high_availability. These modes primarily differ in how resources are allocated and how deployments are executed, allowing you to tailor cost vs. resilience:
- affordable mode is optimized for quick iteration and lower cost, intended for development environments. For example, builds use minimal resources (2 vCPU build instances) and deployments are not highly available. Defang will use cheaper resources (like spot instances) and tolerate downtime during updates (it will tear down and recreate services). Logs are only retained for a short time (1 day).
- balanced mode provides a mid-way point between affordable and high_availability, intended for staging environments. It uses high_availability-like settings for reliability (e.g. rolling updates for deployments) while keeping some limits slightly lower. balanced mode deployments use on-demand instances and mimic high_availability networking (e.g., ensuring a NAT gateway or proper DNS) so that any issues in a production environment can be caught here first. Logs are kept for 7 days for troubleshooting.
- high_availability mode maximizes performance, security, and uptime, intended for production environments. Builds run with more CPU (e.g., 4 vCPUs) for faster delivery of containers. When deploying updates, high_availability mode uses rolling updates with zero downtime: new versions are spun up and old versions are only terminated once the new ones are ready, ensuring continuity. More conservative configurations are applied: for example, databases are provisioned on larger instance classes optimized for production loads, and termination protection or final snapshots are enabled on shutdown of services to prevent data loss. Security is tight: encryption at rest is enforced for managed storage, and any destructive operations (like replacing a database instance) are handled carefully to prevent unintended data loss. Logs are retained longer (e.g. 30 days) for compliance and audit purposes.
5. Integrating Defang Deployments into CI/CD Workflows
Defang can be seamlessly integrated into your continuous integration / continuous deployment (CI/CD) pipelines so that deployments to GCP occur automatically on code changes or merges. Two notable integration paths are using GitHub Actions and the Defang Pulumi Provider.
5.1. GitHub Actions
GitHub Actions is a popular CI/CD service, and Defang provides an official Action to deploy projects as part of your workflow. This allows, for example, automatically running defang compose up --provider=gcp whenever you push to the main branch or create a release.
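A sketch of such a workflow follows. The action reference, input names, and secret names here are assumptions to adapt from Defang's documentation, not verbatim values:

```yaml
# .github/workflows/deploy.yml (illustrative sketch)
name: Deploy to GCP
on:
  push:
    branches: [main]        # deploy on every push to main
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Deploy with Defang
        uses: DefangLabs/defang-github-action@v1   # assumed action reference
        with:
          provider: gcp                            # assumed input name
        env:
          GCP_PROJECT_ID: ${{ secrets.GCP_PROJECT_ID }}
          GCP_LOCATION: us-central1
```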
This approach has several benefits:
- Consistency: The same deployment logic runs in CI as you would run locally, reducing “it works on my machine” issues. The Compose file is the single source of truth.
- Speed: Defang’s rapid deployment means your CI pipeline doesn’t need to handle the nitty-gritty of provisioning; it just calls Defang. And since Defang can do zero-downtime deploys, you can push updates frequently.
- Rollback: If something fails, you can run a GitHub Action to deploy a previous version (since the Compose file can reference a specific image tag or you can keep older images in the container registry).
5.2. Pulumi Provider
Pulumi is an Infrastructure-as-Code (IaC) platform where you write code in languages like Go, Python, TypeScript, etc. to define cloud infrastructure. Recognizing that some users need the full power of IaC programming, Defang created a Pulumi Provider. This allows Pulumi code to call Defang and deploy a Compose project as part of a Pulumi stack.
What does that mean? Imagine you are an enterprise that already uses Pulumi to manage your cloud (VPCs, some databases, etc.). With the Defang Pulumi provider, you can incorporate a Defang deployment into that workflow. For example, you could have a Pulumi program that sets up some prerequisite infrastructure (say a BigQuery dataset), and then use the Defang provider to deploy your app so it connects to those resources.
6. GCP + Defang in Action – Defang Chatbot on GCP
Defang itself uses Defang to deploy its Ask Defang chatbot on GCP. This application uses a number of the features described previously: domainname, managed Redis (via the x-defang-redis extension), and managed LLMs (via the x-defang-llm extension). You can find the source code for the application, including the Compose file, here. This example illustrates the power of Defang + GCP: a complex multi-service application is deployed and orchestrated in less than 100 lines of Compose, utilizing Cloud Run, Memorystore Redis, Vertex AI, and other advanced networking and security features of GCP.
7. Future Enhancements
Defang is rapidly evolving. The Defang team has an active roadmap to further expand its capabilities on GCP and other clouds. Here are some future enhancements in the pipeline that will make GCP + Defang even more powerful:
More Storage Options (MongoDB, Object Storage, Volumes, etc.)
As discussed, support for additional managed services is growing. One confirmed upcoming feature is managed MongoDB. This means applications using MongoDB can get a fully managed MongoDB instance without running it in a container.
Object storage integration is another key future addition. This would allow applications that use an object store (e.g. Google Cloud Storage) to have Defang either create a new bucket or configure access to an existing one, so your app can read and write objects.
Persistent volumes support will address apps that require file system persistence (beyond what a single container’s ephemeral disk can do). For example, if you have a legacy service that writes to disk or an analytics app that uses local disk, Defang might enable volumes by provisioning a network file system (like Google Filestore) and mounting it to the container.
Cost Estimation – Know Before You Deploy
No one likes a surprise cloud bill. Defang is working on a defang estimate command to estimate deployment costs in advance: Defang would analyze your Compose file and GCP pricing to provide an estimated monthly cost for running the app. Such a tool allows teams to choose deployment modes that best suit their needs - e.g. a startup might choose to run in balanced mode and only upgrade to high_availability mode when needed.
8. Conclusion
Defang’s integration with GCP brings together the best of both worlds – the simplicity of Docker Compose and the robustness of GCP’s cloud services. By using Defang, development teams can deploy complex, scalable applications to GCP with a fraction of the effort traditionally required, all while adhering to best practices in security, performance, and cost management. The partnership of Defang and GCP is particularly empowering for startups aiming to iterate quickly without DevOps overhead, and for enterprise teams seeking consistency and compliance in deployments. With ongoing enhancements like expanded managed services, AI integration, and cost estimation, Defang is poised to become a key accelerator for cloud applications on GCP, turning cloud deployment into a highly automated, simple, secure, and developer-friendly experience.