I am trying to run code in Databricks to mount a folder from Azure Blob Storage. I tested with another blob storage account that has soft delete enabled, and that mount still works. But when I create a new blob storage account and try to mount its blob folder, the mount fails, and I believe it is because soft delete is enabled on that account. Is there any setting I need to configure to make this work without disabling the soft delete setting under the blob storage's data protection? Please let me know if you have a solution. Many thanks.
Pic 1: blob storage account (dlsbdpdwhtest) that I can mount, with soft delete not disabled (see its data protection setting).
Pic 2: blob storage account (dlsbdpdwhdev) that I cannot mount (see its data protection setting).
Is there any setting in the blob storage account that I need to check before running the code to mount the folder?
Here is the code for mounting
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": bdp_datalake_access_application_id,
    "fs.azure.account.oauth2.client.secret": bdp_datalake_access_secret_value,
    "fs.azure.account.oauth2.client.endpoint": bdp_datalake_endpoint,
}

# The container and account names in the original URI were redacted; an abfss URI has the form
# abfss://<container>@<storage-account>.dfs.core.windows.net/
mount_source = "abfss://[email protected]/"
mount_folder = "/mnt/bdp-dwh"

print(mount_source)
dbutils.fs.mount(source=mount_source, mount_point=mount_folder, extra_configs=configs)
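Before blaming the soft delete setting, it can help to confirm that the mount point is not already (half-)mounted and to surface the full error message from dbutils.fs.mount. The sketch below is not from the original post; it assumes the same configs, mount_source, and mount_folder variables defined above and only uses standard dbutils.fs calls.

# Hypothetical troubleshooting sketch (not in the original post): list existing mounts,
# unmount a stale mount at the same path, then retry and print the real error message.
existing = [m.mountPoint for m in dbutils.fs.mounts()]
if mount_folder in existing:
    # A previous, possibly broken, mount at this path can make a new mount attempt fail.
    dbutils.fs.unmount(mount_folder)

try:
    dbutils.fs.mount(source=mount_source, mount_point=mount_folder, extra_configs=configs)
    display(dbutils.fs.ls(mount_folder))  # quick sanity check that the mount is readable
except Exception as e:
    # The underlying cause (403 from the service principal, wrong endpoint, firewall, ...)
    # is usually visible in this message.
    print(str(e))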
In your storage account, make sure the Blob Soft Delete option is disabled. If it is enabled, see the page linked below for how to disable it:
https://ganeshchandrasekaran.com/azure-databricks-configure-your-storage-container-to-load-and-write-data-to-azure-object-storage-3db8cd506a25
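If you prefer to check (or disable) the Blob Soft Delete setting from code instead of the portal, something like the following should work. This is only a sketch under assumptions not stated in the question: it uses the azure-identity and azure-storage-blob packages, reuses the same service principal credentials as the mount, and assumes that principal has permission to read and set the blob service properties; the account name dlsbdpdwhdev and the tenant id placeholder are assumptions.

# Hypothetical sketch: inspect and optionally disable blob soft delete with the Azure SDK.
# pip install azure-identity azure-storage-blob
from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobServiceClient, RetentionPolicy

tenant_id = "<your-tenant-id>"  # assumption: tenant of the service principal used for the mount
credential = ClientSecretCredential(
    tenant_id=tenant_id,
    client_id=bdp_datalake_access_application_id,
    client_secret=bdp_datalake_access_secret_value,
)

service = BlobServiceClient(
    account_url="https://dlsbdpdwhdev.blob.core.windows.net",
    credential=credential,
)

# Shows whether soft delete is enabled and the retention days.
props = service.get_service_properties()
print(props["delete_retention_policy"])

# Uncomment to disable blob soft delete (equivalent to the portal's data protection setting):
# service.set_service_properties(delete_retention_policy=RetentionPolicy(enabled=False))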