Allow DABs to dynamically pass in host target #2095

Open
Gijsreyn opened this issue Jan 8, 2025 · 2 comments
Labels
DABs (DABs related issues), Enhancement (New feature or request)

Comments


Gijsreyn commented Jan 8, 2025

Describe the issue

As a user, I want to be able to pass in the workspace host dynamically for a specific target. This would let me feed IaC (Bicep) outputs from an Azure Databricks deployment into DABs within a single DevOps pipeline.

Configuration

bundle:
  name: test
include:
  - resources/*.yml

variables:
  hostName:
    description: The hostname of the Databricks workspace
    type: string


targets:
  dev:
    mode: development
    default: true
    workspace:
      host: ${var.hostName}

Steps to reproduce the behavior

  1. Run databricks configure
  2. Run databricks bundle validate -t 'dev' --var=hostName=<outputFromAzureDeployment>

The hostname cannot be replaced.

OS and CLI version

Databricks CLI v0.233.0

Is this a regression?

No.

@haveyoumetcp

Hey @Gijsreyn, I was having the same issue, but I found a pretty seamless workaround that I wanted to share. I'm using Azure DevOps pipelines and Terraform to deploy Azure Databricks and the DAB.

  1. Use TF to create the DB workspace
  2. Put the DB host URL in a Key Vault
  3. In the AzDo pipeline job that deploys the DAB, read that value from the Key Vault
  4. Now for the fun part: don't set a workspace.host value in the databricks.yml file. The CLI then falls back to the .databrickscfg in the current user's home directory. So what I did was create a .databrickscfg containing only a [DEFAULT] profile, with host set to the URL from the Key Vault. That's working perfectly in my testing so far.

There may be a way to do this without a Key Vault, but we use them pretty extensively so it just worked out.
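To sketch that no-Key-Vault variant (my own assumption, not something tested in this thread): if the deployment job can reach the Terraform state, the host can be read with `terraform output` and written straight into the config. The output name `databricks_host` and the helper function below are hypothetical:

```shell
#!/usr/bin/env sh
# Sketch only: write a minimal .databrickscfg with a single [DEFAULT]
# profile, so the Databricks CLI falls back to it when databricks.yml
# has no workspace.host value.
write_databricks_cfg() {
  host="$1"   # workspace URL, e.g. https://adb-123.azuredatabricks.net
  cfg="$2"    # destination path, normally "$HOME/.databrickscfg"
  printf '[DEFAULT]\nhost = %s\n' "$host" > "$cfg"
}

# Hypothetical usage; `databricks_host` is an assumed Terraform output name:
# write_databricks_cfg "$(terraform output -raw databricks_host)" "$HOME/.databrickscfg"
```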

Here's the YAML for step 4:

- task: AzureKeyVault@2
  displayName: 'Retrieve Host Value from Azure Key Vault'
  inputs:
    azureSubscription: ${{ parameters.azureSubscription }}
    KeyVaultName: ${{ parameters.keyVaultName }}
    SecretsFilter: $(databricksHostSecretName)
    RunAsPreJob: true

- task: Bash@3
  displayName: 'Create .databrickscfg with host value from Azure Key Vault'
  inputs:
    targetType: 'inline'
    script: |
      echo "[DEFAULT]" > ~/.databrickscfg
      echo "host = $(DatabricksHostUrl)" >> ~/.databrickscfg
  env:
    DatabricksHostUrl: $(DatabricksHostUrl)
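One addition of mine (not from the thread): a small guard that fails the job early if the generated config is malformed, before `databricks bundle validate -t dev` or a deploy tries to use it. The function name is made up for this sketch:

```shell
#!/usr/bin/env sh
# Sketch: verify a generated .databrickscfg has a [DEFAULT] profile
# with a host line, so a missing Key Vault value fails fast.
assert_default_host() {
  cfg="$1"
  grep -q '^\[DEFAULT\]' "$cfg" && grep -q '^host = ' "$cfg"
}

# Hypothetical usage in the pipeline, right after the Bash@3 task above:
# assert_default_host "$HOME/.databrickscfg" || { echo "bad .databrickscfg"; exit 1; }
```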


Gijsreyn commented Mar 5, 2025

Hey @haveyoumetcp, thank you very much for writing up such a detailed workaround.

I'm doing somewhat the same, but using the PowerShell module yayaml to do a dynamic replace during the pipeline run :)
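For reference, the same dynamic-replace idea in plain shell (sed swapped in for the yayaml module; a sketch, assuming the databricks.yml layout shown at the top of this issue):

```shell
#!/usr/bin/env sh
# Sketch: rewrite the value of the "host:" key in a bundle file in
# place before running the bundle commands, mirroring the
# replace-during-pipeline approach described above.
set_bundle_host() {
  file="$1"
  host="$2"
  # GNU sed in-place edit; assumes "host:" appears only under
  # workspace, as in the databricks.yml shown in this issue.
  sed -i "s|^\( *host:\).*|\1 ${host}|" "$file"
}
```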
