I am trying to access my Azure storage container to read JSON files (data). I have configured my Spark session like this:
spark.conf.set(f"fs.azure.account.auth.type.{STORAGE_ACCOUNT_NAME}.dfs.core.windows.net", "SAS")
spark.conf.set(f"fs.azure.sas.token.provider.type.{STORAGE_ACCOUNT_NAME}.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set(f"fs.azure.sas.fixed.token.{STORAGE_ACCOUNT_NAME}.dfs.core.windows.net", SAS_TOKEN)
logging.info('spark was configured')
When I try to connect, I get a 403 error: "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature." The call that fails is:
data = get_data_from_file(BLOB_PATH)
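For reference, a minimal sketch of the read itself, assuming get_data_from_file is a thin wrapper around spark.read.json and BLOB_PATH uses the abfss scheme (the container and path below are placeholders, not my real values):

CONTAINER_NAME = "mycontainer"  # placeholder container name

def get_data_from_file(path):
    # Read the JSON file(s) at the given ABFS path into a Spark DataFrame
    return spark.read.json(path)

# abfss://<container>@<account>.dfs.core.windows.net/<path-to-json>
BLOB_PATH = f"abfss://{CONTAINER_NAME}@{STORAGE_ACCOUNT_NAME}.dfs.core.windows.net/data/file.json"
data = get_data_from_file(BLOB_PATH)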
- The SAS token is correct and the file path is correct.
- The firewall is enabled in the container's networking settings (disabling it doesn't help).
- If I try to read from a different storage account it works, but I don't understand why, since I didn't change anything in its settings.
There are multiple possible reasons for this issue:

- Using wasbs and not abfss: I tried the code below with my SAS token and got the same error with my blob storage (see the sketch after this list).
- Accessing files from an Azure blob storage account whose firewall allows traffic only from selected networks: in that case you need to deploy the Databricks workspace in a VNet and configure service endpoints and subnet delegation for its subnets. Also check this SO answer for reference.
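As a sketch of the two access styles with a fixed SAS token (the account, container, path, and token values are placeholders, and spark is the active SparkSession): the URI scheme has to match the endpoint and configuration keys you set.

# Placeholder values -- replace with your own
account = "mystorageaccount"
container = "mycontainer"
sas_token = "<sas-token-without-leading-?>"

# Option 1: ABFS driver against the dfs endpoint (what the question uses)
spark.conf.set(f"fs.azure.account.auth.type.{account}.dfs.core.windows.net", "SAS")
spark.conf.set(f"fs.azure.sas.token.provider.type.{account}.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set(f"fs.azure.sas.fixed.token.{account}.dfs.core.windows.net", sas_token)
df_abfss = spark.read.json(f"abfss://{container}@{account}.dfs.core.windows.net/data/file.json")

# Option 2: legacy WASB driver against the blob endpoint with a container-scoped SAS
spark.conf.set(f"fs.azure.sas.{container}.{account}.blob.core.windows.net", sas_token)
df_wasbs = spark.read.json(f"wasbs://{container}@{account}.blob.core.windows.net/data/file.json")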