Airflow: how to trigger a DAG multiple times and queue the DagRuns to run one at a time


I have a job that I want Airflow to run, but this job cannot run in parallel. The job is based on a Docker image that Airflow launches via KubernetesPodOperator with a specific command. I want to create a DAG that executes this job one instance at a time, but the DAG can be triggered multiple times while a DagRun is already in progress. So my question is: is there a way to trigger a DAG so that each DagRun is queued somewhere, and the DAG consumes that queue and runs each entry in turn, so that no trigger is dropped?

Thanks


Answer from ozs:

Airflow pools can be used to limit the execution parallelism on arbitrary sets of tasks. The list of pools is managed in the UI (Menu -> Admin -> Pools) by giving each pool a name and assigning it a number of worker slots.

For more details, see this link.
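As a sketch, a pool with a single worker slot serializes the task across all DagRuns, and `max_active_runs=1` additionally keeps extra triggers queued at the DagRun level so none are lost. The pool name, image, and command below are placeholders, and the exact import path of KubernetesPodOperator depends on your Airflow/provider version:

```python
# Sketch of a DAG whose triggered runs queue up and execute one at a time.
# Assumes Airflow 2.x with the cncf.kubernetes provider installed.
# "single_job_pool" is a hypothetical pool: create it in Admin -> Pools
# with 1 worker slot so only one task instance can hold the slot at once.
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import (
    KubernetesPodOperator,
)

with DAG(
    dag_id="serialized_job",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,  # triggered manually or via the API
    max_active_runs=1,       # one DagRun executes; later triggers wait, queued
    catchup=False,
) as dag:
    run_job = KubernetesPodOperator(
        task_id="run_job",
        name="run-job",
        image="my-registry/my-job:latest",  # placeholder image
        cmds=["my-command"],                # placeholder command
        pool="single_job_pool",             # 1-slot pool serializes the task
    )
```

With this setup, triggering the DAG repeatedly while a run is in progress leaves the extra DagRuns in the queued state, and the scheduler starts them one after another as each run finishes.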