I need to chain several jobs. Some have to be started right after others have finished, and some need the results of other jobs as input.
It seems that I can start one job after another by using sensors. AI suggests using the @solid and @pipeline decorators, but I was unable to find a suitable example of their usage in the Dagster documentation or on the internet. I can't figure out how to pass output from one job to another. A job_3(job_2()) call doesn't look like the Dagster approach, does it?
Here is some code to illustrate the issue:
@job
def job_1():
    save_to_db_op(
        make_api_call_op()
    )


@job
def job_2():
    out_1, out_2 = process_data_op(
        make_another_api_call_op()
    )
    save_to_db_op(out_1)
    return out_2  # I need to pass it to another job


@job
def job_3(out_2):  # how to pass input here?
    process_op(out_2)
    do_some_other_stuff_op()


# this function is pseudocode to represent what I want to recreate in Dagster
def figure_it_out_pipeline():
    job_1()  # wait until complete
    job_3(
        job_2()
    )