Hi @Janice Chi
Thanks for outlining your scenario. Given your move toward a multi-environment CI/CD model for healthcare-grade pipelines, it's great to see you're evaluating Databricks Asset Bundles (DAB) alongside your existing Terraform-based infrastructure.
Is DAB the recommended approach for multi-environment Databricks deployments?
Yes – Databricks Asset Bundles (DAB) are now the recommended and supported approach by Databricks for modular, code-driven deployments of notebooks, jobs, workflows, and environment-specific configurations.
They integrate well with Azure DevOps, GitHub Actions, and other CI/CD tools, making them suitable for structured promotion across Dev → QA → PreProd → Prod.
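As a rough sketch, structured promotion usually maps each pipeline stage to a per-target deploy with the Databricks CLI. The target names (`dev`, `qa`) and the job key `ingest_job` are assumptions about your `bundle.yml`, not fixed names:

```shell
# Validate the bundle definition before deploying anywhere
databricks bundle validate

# Deploy to a target defined under targets: in bundle.yml
databricks bundle deploy -t dev

# A later pipeline stage promotes the same source to QA
databricks bundle deploy -t qa

# Trigger a job defined in the bundle (job key is hypothetical)
databricks bundle run -t qa ingest_job
```

The same bundle source moves unchanged between stages; only the `-t` target differs, which is what makes the Dev → QA → PreProd → Prod flow auditable.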
How does DAB compare with Terraform?
| Area | Databricks Asset Bundles (DAB) | Terraform for Databricks |
| --- | --- | --- |
| Primary Purpose | Application deployment (jobs, workflows, code) | Infrastructure provisioning (clusters, ACLs, etc.) |
| CI/CD Integration | Built-in support (e.g., `bundle deploy`, `bundle run`) | Requires external scripting |
| Environment Overrides | Native support via `targets:` in `bundle.yml` | Requires variable files and conditional logic |
| Secrets/Mounts | Not supported (must be handled separately) | Can provision secrets, scopes, mounts |
| Unity Catalog Support | Fully supported in latest DAB versions | Supported |
| Versioning | Git-based, declarative (via repo + DAB CLI) | Declarative (via `.tf` files) |
| Deployment to Repos | Supports linking to Databricks Repos as source | Terraform supports Repos via API or init scripts |
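To illustrate the "native environment overrides" row, here is a minimal `bundle.yml` sketch using `targets:`. The bundle name and workspace hosts are placeholders, not values from your setup:

```yaml
# Sketch: per-environment targets in bundle.yml (names/hosts are placeholders)
bundle:
  name: hc_claims_pipeline

targets:
  dev:
    mode: development
    workspace:
      host: https://adb-1111111111111111.11.azuredatabricks.net
  prod:
    mode: production
    workspace:
      host: https://adb-2222222222222222.22.azuredatabricks.net
```

Each target can override workspace, variables, and resource settings, which replaces the variable-file-plus-conditionals pattern you would need in Terraform.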
DAB and Terraform are complementary - not competing. Most customers use:
- Terraform to provision infrastructure
- DAB to deploy jobs, notebooks, workflows, and environment-specific configurations
How does DAB work with DBR Repos?
- DAB can deploy to Databricks Repos directly, allowing developers to version notebooks in Git and sync them with the workspace.
- You can also define repos as dependencies or sources in `bundle.yml`.
- DAB supports workflows that run code from Repos, not just workspace paths.
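A hedged sketch of the last point: a bundle-defined job that runs a notebook straight from a Git repo rather than a workspace path. The repo URL, branch, job key, and notebook path are all placeholders:

```yaml
# Sketch: bundle job sourcing its notebook from Git (all names are placeholders)
resources:
  jobs:
    ingest_job:
      name: ingest-from-repo
      git_source:
        git_url: https://github.com/example-org/pipelines
        git_provider: gitHub
        git_branch: main
      tasks:
        - task_key: ingest
          notebook_task:
            notebook_path: notebooks/ingest
            source: GIT
```

With `source: GIT`, the job checks out the pinned branch at run time, so what executes in Prod is exactly what was reviewed in Git.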
Healthcare Considerations & Best Practices
Given the regulatory and audit requirements:
- Keep secrets (e.g., connection strings, SHIR names) outside the bundle and inject them at deploy/runtime via:
  - Azure Key Vault integration
  - Pipeline environment variables
- Tag runs with `run_id`, `env`, and `git_commit` for audit traceability.
- Use target overrides in `bundle.yml` to:
  - Point to different SQL endpoints
  - Change notebook parameters
  - Control retry logic per environment
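The practices above can be sketched together in one `bundle.yml` fragment. This is illustrative only: the scope name, warehouse IDs, and paths are invented, and `{{secrets/<scope>/<key>}}` references resolve at runtime so no sensitive value ever lands in the bundle itself:

```yaml
# Illustrative only: secret references, audit tags, and per-target overrides
variables:
  warehouse_id:
    description: SQL warehouse to target (overridden per environment)
    default: "dev-warehouse-id"

resources:
  jobs:
    ingest_job:
      name: hc-ingest
      tags:
        # Bundle substitutions stamp each deployment for audit traceability
        env: ${bundle.target}
        git_commit: ${bundle.git.commit}
      tasks:
        - task_key: ingest
          notebook_task:
            notebook_path: ./notebooks/ingest
            base_parameters:
              warehouse_id: ${var.warehouse_id}
              # Resolved from a Key Vault-backed secret scope at runtime
              conn_string: "{{secrets/hc-kv-scope/sql-conn-string}}"

targets:
  qa:
    variables:
      warehouse_id: "qa-warehouse-id"
  prod:
    variables:
      warehouse_id: "prod-warehouse-id"
```

The target-level `variables:` overrides are what let one bundle point at different SQL endpoints per environment without branching the code.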
Known DAB Limitations
- Doesn't manage:
  - Secrets (must use Terraform or the CLI)
  - Mount points
  - Unity Catalog permissions (use Terraform or REST APIs)
- No direct support for Delta Live Tables (DLT) yet
- Testing large bundles (>100 jobs) may need careful orchestration
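Since DAB leaves secrets out of scope, here is a minimal Terraform sketch of how that gap is typically filled, assuming the `databricks` and `azurerm` providers are configured; the scope and Key Vault names are placeholders:

```hcl
# Sketch: Key Vault-backed secret scope provisioned by Terraform,
# complementing a DAB-deployed job that references {{secrets/...}}.
resource "databricks_secret_scope" "kv" {
  name = "hc-kv-scope"

  # Back the scope with Azure Key Vault (resource names are placeholders)
  keyvault_metadata {
    resource_id = azurerm_key_vault.pipeline.id
    dns_name    = azurerm_key_vault.pipeline.vault_uri
  }
}
```

This keeps the division of labor described above: Terraform owns the infrastructure and secret plumbing, DAB owns the jobs that consume it.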
Hope this helps. If this answers your query, do click Accept Answer and Yes for "Was this answer helpful". And if you have any further query, do let us know.