
[ISSUE] Issue with databricks_credential resource #4330

Open
DanStutz opened this issue Dec 17, 2024 · 2 comments
DanStutz commented Dec 17, 2024

Configuration

AWS Deployment

resource "databricks_credential" "dbx_credential" {
  provider = databricks.workspace
  name            = var.dbx_credential_name
  aws_iam_role {
    role_arn = var.aws_iam_role_arn
  }
  purpose         = var.credential_type
  comment         = "Managed by TF: ${var.credential_comment}"
  isolation_mode  = var.credential_isolation_mode
  skip_validation = true
}

resource "databricks_grants" "dbx_creds_grants" {
  provider = databricks.workspace
  credential       = databricks_credential.dbx_credential.id
  dynamic "grant" {
    for_each       = var.credential_principals
    content {
      principal    = grant.value
      privileges   = var.credential_privileges
    }
  }
}

# Module call to above code

module "firehose_dbx_service_credentials" {
  providers = {
    databricks.workspace = databricks.workspace
  }
  source                    = "../credentials"
  dbx_credential_name       = "dbx-firehose-role-${var.environment}"
  credential_type           = "SERVICE"
  credential_comment        = "For use to log data to Firehose from databricks"
  aws_iam_role_arn          = aws_iam_role.assume_firehose_role.arn
  credential_principals     = var.service_credential_principals
  credential_privileges     = ["ACCESS", "CREATE CONNECTION"]
  credential_isolation_mode = "ISOLATION_MODE_ISOLATED"
}
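For reference, the ../credentials module would also need to declare the provider alias and the input variables used above. This is a minimal sketch of that interface, assuming these types; the real module may differ:

# Sketch of the ../credentials module interface (assumed, not from the original report).

terraform {
  required_providers {
    databricks = {
      source                = "databricks/databricks"
      # The module receives the workspace-level provider through this alias.
      configuration_aliases = [databricks.workspace]
    }
  }
}

variable "dbx_credential_name" {
  type = string
}

variable "aws_iam_role_arn" {
  type = string
}

variable "credential_type" {
  type = string # e.g. "SERVICE"
}

variable "credential_comment" {
  type = string
}

variable "credential_isolation_mode" {
  type = string # e.g. "ISOLATION_MODE_ISOLATED"
}

variable "credential_principals" {
  type = set(string)
}

variable "credential_privileges" {
  type = list(string)
}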

Expected Behavior

I expected the resource to create successfully, since this is how the resource is defined in the documentation. By setting the isolation mode to ISOLATION_MODE_ISOLATED, I expected the service credential to be created and locked down so it could only be used by the workspace I passed into the configuration via the workspace provider. All resources are provisioned with the workspace-level provider.

Actual Behavior

What actually happened is that I received the error below during my TF apply, saying that SERVICE_CREDENTIAL is not a valid securable type:

[screenshot: terraform apply error reporting that SERVICE_CREDENTIAL is not a valid securable type]

After the apply ran, my TF plan no longer works: it looks up the service credential in DBX and cannot find it, reporting that it does not exist and is "not accessible in the current workspace", even though those resources should never have been created in the first place because of the error. This is all deployed through a GitLab pipeline, so the location the code is deployed from has not changed. A screenshot is below; the redacted words are just the name of the DBX service credential I wanted to create, which the plan is searching for, for some reason. I also commented out the code block above to see if Terraform would try to destroy the credential, but it gave the error in the screenshot below (very strange from a TF perspective).

[screenshot: terraform plan error stating the service credential does not exist or is not accessible in the current workspace]
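One possible way to unblock the plan (a sketch only, assuming the credential object is gone on the API side or will be recreated afterwards) is to drop the orphaned entry from Terraform state; the resource address below follows from the module call shown above:

# Drop the orphaned credential from Terraform state so the next plan
# stops looking it up (assumes the object should no longer be managed here).
terraform state rm 'module.firehose_dbx_service_credentials.databricks_credential.dbx_credential'

# If Terraform instead reports the credential as tainted, clear the taint:
terraform untaint 'module.firehose_dbx_service_credentials.databricks_credential.dbx_credential'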

Steps to Reproduce

  1. Copy the above code and give it the required input values
  2. Run a TF plan; it should plan the deployment successfully
  3. Run a TF apply to create the resources; it should fail with the error in the first screenshot
  4. Run another TF plan; it will throw the error in the second screenshot

Terraform and provider versions

version = "~> 1.61.0"

Is it a regression?

No, this is a brand-new feature that was added in the 1.60.0 release of the Databricks TF provider.

Important Factoids

No

Would you like to implement a fix?

No

@michaelvonderbecke

I ran into this same issue. It appears the service credential is actually created, but it fails to bind to the workspace the deployment happens through (my Terraform isolates the service credential to the current workspace). If I then go into the UI and add the workspace Terraform connects through to the list of workspaces the SC is bound to, Terraform is able to run a plan again, but it then says the credential is tainted and needs to be recreated.
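For reference, a sketch of declaring that binding in Terraform instead of through the UI, assuming a provider version where databricks_workspace_binding accepts securable_type = "credential"; the workspace_id input here is hypothetical:

# Sketch only: explicitly bind the isolated service credential to the
# workspace Terraform deploys through. Assumes securable_type = "credential"
# is supported by the provider version in use.
resource "databricks_workspace_binding" "dbx_credential_binding" {
  provider       = databricks.workspace
  securable_name = databricks_credential.dbx_credential.name
  securable_type = "credential"
  workspace_id   = var.workspace_id # hypothetical input: ID of the deploying workspace
  binding_type   = "BINDING_TYPE_READ_WRITE"
}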

@alexott
Contributor

alexott commented Dec 19, 2024

Please upgrade to 1.62.0
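For example, by bumping the provider constraint (a minimal sketch, assuming the standard databricks/databricks source):

terraform {
  required_providers {
    databricks = {
      source  = "databricks/databricks"
      # Pin to 1.62.0 and later patch releases, per the suggested fix.
      version = "~> 1.62.0"
    }
  }
}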
