Change port of databricks-connect to 443


I have a Databricks cluster on the 10.4 runtime. When I run `databricks-connect configure` and enter all the required information, using the default port 15001, `databricks-connect test` works.

But after changing the port to 443 it no longer works. Starting a PySpark session also fails, with:

```
: com.databricks.service.SparkServiceConnectionException: Invalid port (443) or invalid token:

The port you specified is either being used already or invalid.
Port: The port that Databricks Connect connects to
  - The default Databricks Connect port is 15001
  - Get current value: spark.conf.get("spark.databricks.service.port")
  - Set via conf: spark.conf.set("spark.databricks.service.port", <your port>)
```

I added the following Spark configuration to my cluster:

```
spark.databricks.service.server.enabled true
spark.databricks.service.port 443
```

But it did not work.
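For context, the client side has to agree with whatever port the cluster serves on: `databricks-connect configure` writes its settings to a `~/.databricks-connect` JSON file, and the `port` field there must match the cluster's `spark.databricks.service.port`. A sketch of that file (the host, token, and cluster ID are placeholders, not real values):

```json
{
  "host": "https://adb-1234567890123456.7.azuredatabricks.net",
  "token": "dapiXXXXXXXXXXXXXXXX",
  "cluster_id": "0123-456789-example",
  "org_id": "0",
  "port": "443"
}
```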

Why doesn't it work? The reason I want to switch to port 443 is simple: my GitHub Actions workflows run on a self-hosted runner behind a firewall that blocks port 15001, so I am trying to find a port that is open.
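Independently of Databricks, the firewall situation can be checked directly from the self-hosted runner: a plain TCP connect shows which ports are actually reachable. A minimal sketch (the workspace hostname is a placeholder; substitute your own):

```python
import socket


def is_port_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Attempt a TCP connection; True if the port accepts connections."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    # Hypothetical workspace host; replace with your Databricks workspace URL host.
    for port in (443, 15001):
        print(port, is_port_reachable("example.cloud.databricks.com", port))
```

If 443 is reachable from the runner but 15001 is not, that confirms the firewall is the constraint rather than the Databricks configuration.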


1 Answer

Answered by DileeprajnarayanThumula

I tried the following approaches in a Databricks notebook:

1st Approach:

```
spark.conf.set("spark.databricks.service.server.enabled", "true")
spark.conf.set("spark.databricks.service.port", "443")
```

Results:

```
print(spark.conf.get("spark.databricks.service.server.enabled"))
print(spark.conf.get("spark.databricks.service.port"))
```

Output:

```
true
443
```

2nd Approach:

In the cluster configuration page, go to Edit > Advanced settings > Spark configuration and add:

```
spark.databricks.service.server.enabled true
spark.databricks.service.port 443
```

After adding the Spark configuration, confirm and restart the cluster.
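Once the cluster is back up on the new port, the client config must be updated to the same value before re-running `databricks-connect test`. Assuming the standard `~/.databricks-connect` JSON file that `databricks-connect configure` writes, a small sketch of that update (the helper name is mine, not part of the tool):

```python
import json
from pathlib import Path


def set_databricks_connect_port(
    port: str, cfg_path: Path = Path.home() / ".databricks-connect"
) -> dict:
    """Rewrite the 'port' field in the Databricks Connect client config file."""
    cfg = json.loads(cfg_path.read_text())
    cfg["port"] = str(port)
    cfg_path.write_text(json.dumps(cfg, indent=2))
    return cfg
```

After running this with `"443"`, `databricks-connect test` should attempt the connection on the new port.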