I have a Workflow implemented on GCP that triggers a containerized batch job. (Here is the tutorial for something similar: https://cloud.google.com/workflows/docs/tutorials/batch-and-workflows). The service account that runs this job has the "Cloud SQL Client" role.
Still, I keep getting this error when trying to connect to the database (in Python, using psycopg2):
psycopg2.OperationalError: connection to server on socket "/cloudsql/PROJECT_NAME:us-central1:DB_NAME/.s.PGSQL.5432" failed: No such file or directory
Is the server running locally and accepting connections on that socket?
Why?
Jonathan from the Cloud SQL Connector team here. There are two ways to go about this.
First option: Configure your application to connect to your Cloud SQL database instance over TCP using the Cloud SQL Connector for Python (see the Python connector example). The connector uses the `pg8000` database driver; unfortunately, it does not work with the `psycopg2` library. A sketch of this approach follows.
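Here is a minimal sketch of that pattern, assuming SQLAlchemy plus the connector package (`pip install "cloud-sql-python-connector[pg8000]" sqlalchemy`); the instance connection name and credentials are placeholders to substitute:

```python
import sqlalchemy
from google.cloud.sql.connector import Connector

# Placeholder values -- replace with your instance connection name
# and database credentials.
INSTANCE_CONNECTION_NAME = "my-project:us-central1:my-instance"

connector = Connector()

def getconn():
    # The connector opens an authorized, encrypted connection and
    # returns a pg8000 connection object.
    return connector.connect(
        INSTANCE_CONNECTION_NAME,
        "pg8000",
        user="my-user",
        password="my-password",
        db="my-db",
    )

# A SQLAlchemy pool that calls the connector for each new connection.
pool = sqlalchemy.create_engine("postgresql+pg8000://", creator=getconn)

with pool.connect() as conn:
    print(conn.execute(sqlalchemy.text("SELECT NOW()")).scalar())
```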
Second option: Add the Cloud SQL Auth Proxy container to your batch job.
This is a slightly modified job workflow definition from the Run a Batch Job guide:
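The definition below is a sketch rather than a verbatim copy of the guide: the application image URI and job ID prefix are placeholders, and the proxy image is assumed to be the current v2 `cloud-sql-proxy` image.

```yaml
main:
  params: [args]
  steps:
    - init:
        assign:
          - projectId: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
          - region: "us-central1"
          - batchApiUrl: ${"https://batch.googleapis.com/v1/projects/" + projectId + "/locations/" + region + "/jobs"}
          # Placeholder: your application container image.
          - imageUri: ${region + "-docker.pkg.dev/" + projectId + "/containers/my-batch-app:v1"}
          - jobId: ${"job-cloudsql-" + string(int(sys.now()))}
    - createAndRunBatchJob:
        call: http.post
        args:
          url: ${batchApiUrl}
          query:
            job_id: ${jobId}
          headers:
            Content-Type: application/json
          auth:
            type: OAuth2
          body:
            taskGroups:
              - taskSpec:
                  runnables:
                    # Cloud SQL Auth Proxy sidecar. background: true keeps it
                    # running while the application runnable executes.
                    - container:
                        imageUri: "gcr.io/cloud-sql-connectors/cloud-sql-proxy:latest"
                        commands:
                          - "--port=<DB_PORT>"
                          - "<INSTANCE_CONNECTION_NAME>"
                      background: true
                    # Your application container; connects to 127.0.0.1:<DB_PORT>.
                    - container:
                        imageUri: ${imageUri}
            logsPolicy:
              destination: CLOUD_LOGGING
    - returnJobId:
        return: ${jobId}
```

Marking the proxy runnable `background: true` tells Batch to keep it running alongside the application runnable instead of waiting for it to exit.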
Replace `<INSTANCE_CONNECTION_NAME>` with the connection name of your database instance, something like `my-project:us-central1:instance`.

Replace `<DB_PORT>` with a TCP port appropriate for your database. The proxy container will start listening on this port when it starts. Your application should use its database driver to connect to `localhost:<DB_PORT>`, as in the sketch below.

In this configuration, your application will still need to authenticate to the database using a database username and password. If you want to use IAM authentication, you will need to adapt this based on the Cloud SQL Auth Proxy guide.
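For example, with the proxy listening on `<DB_PORT>` (assumed here to be 5432), the original `psycopg2` code only needs to switch from the Unix socket to TCP; the database name and credentials are placeholders:

```python
import psycopg2

# Placeholder credentials -- substitute your own database, user, and password.
conn = psycopg2.connect(
    host="127.0.0.1",   # the proxy sidecar listens locally
    port=5432,          # must match <DB_PORT> in the workflow definition
    dbname="my-db",
    user="my-user",
    password="my-password",
)
with conn, conn.cursor() as cur:
    cur.execute("SELECT 1")
    print(cur.fetchone())
```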