How to connect to Azure SQL Server in Databricks using Windows authentication


The server type is Database Engine, and the authentication type used when connecting through SSMS is Windows authentication. I want to connect to the same database from Azure Databricks in the same way.

I have used this code for the connection:

jdbcHostname = "#####"
jdbcPort = "1433"
jdbcDatabase = "####"
jdbcUsername = "##########"
jdbcPassword = "####"

jdbcUrl = f"jdbc:sqlserver://{jdbcHostname}:{jdbcPort};database={jdbcDatabase};integratedSecurity=true"
connectionProperties = {
    "user": jdbcUsername,
    "authenticationScheme": "NativeAuthentication",
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver"
}
jdbcDF = spark.read.jdbc(url=jdbcUrl, table="######", properties=connectionProperties)

But I am getting this error:

The TCP/IP connection to the host #####, port 1433 has failed. Error: "#####. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall."

So, what should be the Python code for this connection?

[Screenshot: SSMS login page]


There is 1 best solution below

Bhavani On
 The TCP/IP connection to the host C1ML63563-MS, port 1433 has failed. Error: "C1ML63563-MS. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.

The server may not be configured to accept TCP/IP, which could be the reason for the error above. To resolve the issue, enable TCP/IP for the SQL Server as follows:

Go to SQL Server Configuration Manager, expand SQL Server Network Configuration, and select the protocols for your instance. Enable TCP/IP by double-clicking on TCP/IP as shown below:

[Screenshot: TCP/IP enabled under SQL Server Network Configuration]

Additionally, under SQL Native Client Configuration, select Client Protocols, double-click on TCP/IP, and set 1433 as the default port as shown below:

[Screenshot: TCP/IP default port set to 1433 under Client Protocols]

Restart the SQL Server service. After that, as per this, use a Windows-authenticated username and password with the JDBC driver to connect to the SQL database from a Databricks notebook. You can use the code below for the connection:

# Connection details - replace the placeholders with your own values
jdbcHostname = "<serverName>"
jdbcPort = 1433
jdbcDatabase = "<databaseName>"
jdbcUsername = "<WindowsuserName>"
jdbcPassword = "<password>"
jdbcDriver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
table = "<tableName>"

# Build the JDBC URL and read the table into a Spark DataFrame
jdbcUrl = f"jdbc:sqlserver://{jdbcHostname}:{jdbcPort};databaseName={jdbcDatabase}"
df = (spark.read.format("jdbc")
      .option("driver", jdbcDriver)
      .option("url", jdbcUrl)
      .option("dbtable", table)
      .option("user", jdbcUsername)
      .option("password", jdbcPassword)
      .load())
df.show()

You will get the table data as output, as shown below:

[Screenshot: query output showing the table data]
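If the same TCP/IP error still appears after these changes, you can quickly confirm from a Databricks notebook cell that the host is reachable on port 1433 before retrying the JDBC read. This is a minimal sketch using Python's standard socket module; the host name below is a placeholder for your own server:

import socket

# Placeholder values - replace with your own server name and port
host = "<serverName>"
port = 1433

try:
    # Attempt a plain TCP connection with a short timeout
    with socket.create_connection((host, port), timeout=5):
        print(f"TCP connection to {host}:{port} succeeded")
except OSError as e:
    # A failure here points to name resolution, firewall, or listener issues
    print(f"TCP connection to {host}:{port} failed: {e}")

If this check fails, the problem is at the network level (name resolution, firewall, or the server not listening on the port) rather than in the Spark or JDBC configuration.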

For more information, you can refer to this.