Error setting access control for Path using Terraform resource "azurerm_storage_data_lake_gen2_path"
I'm trying to add a new AD Object ID to the ACL, using an ace block, for already existing subfolders inside an Azure Data Lake container, and I'm getting the error below. Note: a fresh deployment (or a deployment after a destroy) works, but updating in place fails with this error.
Error setting access control for Path "subfolder1" in File System "test-container" in Storage Account "testdatalakestorage": datalakestore.Client#SetAccessControl: Failure responding to request: StatusCode=403 -- Original Error: autorest/azure: Service returned an error. Status=403 Code="AuthorizationPermissionMismatch" Message="This request is not authorized to perform this operation using this permission.\nRequestId:a####e2-###-0##7-1##0-5##########00\nTime:XXXXX XXXX"
Need help fixing this; below are the Terraform version and the code I'm testing.
terraform version = "v0.14.4", azurerm provider version = "2.70.0"
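For reference, a minimal sketch of pinning these versions in the configuration; the `required_providers` block is standard Terraform syntax and not taken from the question:

```hcl
terraform {
  # Pin the versions quoted above so updates run against the same toolchain.
  required_version = "~> 0.14.4"

  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "2.70.0"
    }
  }
}
```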
Terraform code to create the Data Lake filesystem:
resource "azurerm_storage_data_lake_gen2_filesystem" "test-containers" {
name = "test-container"
storage_account_id = module.storage.id
# Access Permission ACL for Container/Root Directory
ace {
# no id => owning user (i.e. deployment account)
permissions = "rwx"
scope = "access"
type = "user"
}
ace {
permissions = "rwx"
scope = "access"
type = "group"
}
ace {
type = "group"
scope = "access"
id = XXXXXX-XXXX-XXXX-XXXX
permissions = "r-x"
}
ace {
type = "group"
scope = "access"
id = XXXXXX-XXXX-XXXX-XXXX
permissions = "r-x"
ace {
permissions = "rwx"
scope = "access"
type = "mask"
}
ace {
permissions = "---"
scope = "access"
type = "other"
}
ace {
permissions = "rwx"
scope = "default"
type = "user"
}
ace {
permissions = "rwx"
scope = "default"
type = "group"
}
ace {
permissions = "rwx"
scope = "default"
type = "mask"
}
ace {
permissions = "---"
scope = "default"
type = "other"
}
}
Terraform code to create subfolder1 inside the Data Lake filesystem and apply the ACL:
resource "azurerm_storage_data_lake_gen2_path" "subfolder1" {
path = "subfolder1"
filesystem_name = azurerm_storage_data_lake_gen2_filesystem.test-containers.name
storage_account_id = module.storage.id
resource = "directory"
ace {
# no id => owning user (i.e. deployment account)
permissions = "rwx"
scope = "access"
type = "user"
}
ace {
permissions = "rwx"
scope = "access"
type = "group"
}
ace {
type = "group"
scope = "access"
id = XXXXXX-XXXX-XXXX-XXXX
permissions = "r-x"
}
ace {
type = "group"
scope = "access"
id = XXXXXX-XXXX-XXXX-XXXX
permissions = "r-x"
}
ace {
permissions = "rwx"
scope = "access"
type = "mask"
}
ace {
permissions = "---"
scope = "access"
type = "other"
}
ace {
permissions = "rwx"
scope = "default"
type = "user"
}
ace {
permissions = "rwx"
scope = "default"
type = "group"
}
ace {
permissions = "rwx"
scope = "default"
type = "mask"
}
ace {
permissions = "---"
scope = "default"
type = "other"
}
}
Solution 1
Usually, the identity used to perform the deployment needs a storage-specific role granted on the account. Built-in roles that can be assigned include Storage Account Contributor, Storage Blob Data Owner, Storage Blob Data Contributor, and Storage Blob Data Reader.
Check which of these roles is granted in your case; otherwise, assign a storage-specific role to the service principal. Go to Storage account -> Access Control (IAM) -> Add -> Add role assignment, then add the storage-specific role to your service principal.
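If you manage role assignments in Terraform as well, a minimal sketch of granting the deployment identity Storage Blob Data Owner on the account might look like the following. The `module.storage.id` reference mirrors the question's code; the `azurerm_client_config` data source is an assumption that Terraform runs as the service principal that needs the role:

```hcl
# Assumption: the identity running Terraform is the one that needs
# data-plane permission to set ACLs on existing paths.
data "azurerm_client_config" "current" {}

resource "azurerm_role_assignment" "deployer_blob_data_owner" {
  # Scope the role to the storage account referenced in the question.
  scope                = module.storage.id
  role_definition_name = "Storage Blob Data Owner"
  principal_id         = data.azurerm_client_config.current.object_id
}
```

Note that role assignments can take a few minutes to propagate, so an apply that sets ACLs immediately after the assignment may still return 403 until the permission takes effect.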
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | PratikLad-MT |