Is it possible to trigger a Glue job (PySpark) from another Glue job (PySpark) using boto3?
Everything seems to be working fine (no syntax or code errors) except the boto3 method
glue_client.start_job_run()
I tested similar code in a Lambda function and it works fine, but in my case I need to call it from Glue.
My Glue jobs need to exchange some parameters, so I'm trying to start the second job with the method above, passing the required parameters as arguments to it.
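For context, this is roughly how I expect the audit job to read those arguments on its side (just a sketch using getResolvedOptions; the option names mirror the keys in the snippet further down):

import sys
from awsglue.utils import getResolvedOptions

# Resolve the arguments passed in via start_job_run (keys without the "--" prefix)
audit_args = getResolvedOptions(
    sys.argv,
    ["JobName", "JobId", "redshift_database", "redshift_table_schema",
     "TableName", "Last_executed_time", "No_of_source_records",
     "No_of_target_records", "Records_updated"]
)
print(audit_args["TableName"])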
I even tried Glue workflows and sent the required parameters via Lambda to the workflow, but the jobs in the workflow are not able to read the workflow run properties:
glue_client.get_workflow_run_properties(Name=workflow_name, RunId=workflow_run_id)["RunProperties"]
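Concretely, inside a job that runs as part of the workflow I do roughly this (a sketch; it relies on Glue passing WORKFLOW_NAME and WORKFLOW_RUN_ID to jobs started by a workflow trigger):

import sys
import boto3
from awsglue.utils import getResolvedOptions

# Glue injects WORKFLOW_NAME and WORKFLOW_RUN_ID into jobs started by a workflow
wf_args = getResolvedOptions(sys.argv, ["WORKFLOW_NAME", "WORKFLOW_RUN_ID"])

glue_client = boto3.client("glue")
run_properties = glue_client.get_workflow_run_properties(
    Name=wf_args["WORKFLOW_NAME"],
    RunId=wf_args["WORKFLOW_RUN_ID"]
)["RunProperties"]
print(run_properties)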
My question is: am I using the boto3 Glue client wrong? Is it supposed to be used only in Python shell jobs and not with Spark jobs?
Code snippet for starting a Glue job from another Glue job:
# Trigger audit glue job
import boto3

# Declare boto3 glue client
glue_client = boto3.client('glue')

# Arguments passed to the audit job (keys must start with "--")
inputParams = {
    "--JobName": args["JOB_NAME"],
    "--JobId": args["JOB_RUN_ID"],
    "--redshift_database": redshift_database,
    "--redshift_table_schema": redshift_table_schema,
    "--TableName": redshift_table,
    "--Last_executed_time": start_time,
    "--No_of_source_records": count_of_source_records,
    "--No_of_target_records": count_of_target_records,
    "--Records_updated": count_of_updated_records
}

print("Triggering audit glue job")
audit_job = glue_client.start_job_run(JobName='audit_populate', Arguments=inputParams)
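For completeness, after start_job_run returns I can check the state of the triggered run like this (a sketch using get_job_run on the returned JobRunId):

# Sketch: inspect the run that was just started
run_state = glue_client.get_job_run(
    JobName='audit_populate',
    RunId=audit_job['JobRunId']
)["JobRun"]["JobRunState"]
print("Audit job run state: " + run_state)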