I configured my .databrickscfg with a host that I authenticate to with OAuth. With that setup I can successfully use the Upload and Run File on Databricks command in VS Code.
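For reference, my .databrickscfg looks roughly like this (the profile name, host, and auth_type below are illustrative placeholders, not my real values):

```ini
[DEFAULT]
; placeholder values -- real host and cluster id redacted
host       = https://<my-workspace>.cloud.databricks.com
auth_type  = databricks-cli   ; OAuth (U2M) via the Databricks CLI
cluster_id = <my-cluster-id>
```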
However, when I now try to run this cell from a Jupyter notebook in VS Code:
from databricks.connect.session import DatabricksSession as SparkSession
spark = SparkSession.builder.getOrCreate()
it yields the following error:
default auth: cannot configure default credentials. Config: host=X, auth_type=metadata-service, cluster_id=Y. Env: DATABRICKS_HOST, DATABRICKS_AUTH_TYPE, DATABRICKS_CLUSTER_ID
I was expecting no difference between Upload and Run File on Databricks and the Jupyter notebook environment. Does anyone have an idea what's going wrong?