I am new to ADF. I am trying to create an ADF pipeline that runs a query on Delta tables and sends the output CSV file to an SFTP server. I should not use a Databricks connection. Is that possible?
I tried using the Azure Databricks connection, but my manager says it is a costlier operation and asked me to try without it.
If you want to send the Delta table data to SFTP using ADF, then a connection to Databricks is necessary. Without the Databricks connection, ADF cannot copy the query output to the target.
If you are willing to send the Delta table data to SFTP from code itself rather than through ADF activities, then you can try the workaround below.
Put all the code in a notebook and create a job for that notebook.
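As a rough idea of what that notebook could contain, here is a minimal sketch. The table name, SFTP host, credentials, and remote path are placeholders, and it assumes the `paramiko` library is available on the cluster (install it with `%pip install paramiko` if needed):

```python
# Minimal notebook sketch: query a Delta table, build a CSV, upload it over SFTP.
# Table name, SFTP host, credentials, and paths are placeholders (assumptions).
import io
import paramiko

# `spark` is the SparkSession already available in a Databricks notebook.
df = spark.sql("SELECT * FROM my_db.my_table WHERE load_date = current_date()")
csv_text = df.toPandas().to_csv(index=False)

# Upload the CSV content to the SFTP server.
transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="sftp_user", password="sftp_password")
sftp = paramiko.SFTPClient.from_transport(transport)
try:
    sftp.putfo(io.BytesIO(csv_text.encode("utf-8")), "/upload/output.csv")
finally:
    sftp.close()
    transport.close()
```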
Use the Job ID and trigger this job with a REST API POST request to the Jobs `run-now` endpoint.
Use a Web activity like the one below to do this.
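For reference, in the Web activity set the URL to the workspace's `run-now` endpoint, the method to POST, add an `Authorization: Bearer <token>` header, and put the `job_id` JSON in the body. The same request can be tested outside ADF with a small Python sketch like the one below; the workspace URL, token, and job ID are placeholders:

```python
# Rough Python equivalent of what the Web activity sends, useful for testing
# the job trigger before wiring it into the pipeline. All values are placeholders.
import requests

workspace_url = "https://adb-1234567890123456.7.azuredatabricks.net"
token = "<databricks-personal-access-token>"

response = requests.post(
    f"{workspace_url}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {token}"},
    json={"job_id": 123456789},
)
response.raise_for_status()
print(response.json())  # contains the run_id of the triggered run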
The Web activity will execute the job and return the `run_id` of the job run.

Notebook execution:
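If the pipeline needs to wait for the job to finish before a later activity runs, the returned `run_id` can be passed to the Jobs `runs/get` endpoint, typically from another Web activity inside an Until loop. A hedged Python sketch of that status check (again with placeholder URL, token, and IDs):

```python
# Sketch of polling the run status using the run_id returned by run-now.
# In ADF this would usually be another Web activity inside an Until loop.
import time
import requests

workspace_url = "https://adb-1234567890123456.7.azuredatabricks.net"
token = "<databricks-personal-access-token>"
run_id = 987654321  # value returned by the run-now call

while True:
    run = requests.get(
        f"{workspace_url}/api/2.1/jobs/runs/get",
        headers={"Authorization": f"Bearer {token}"},
        params={"run_id": run_id},
    ).json()
    if run["state"]["life_cycle_state"] in ("TERMINATED", "SKIPPED", "INTERNAL_ERROR"):
        print(run["state"].get("result_state"))  # e.g. SUCCESS or FAILED
        break
    time.sleep(30)
```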