Airflow config for running concurrent DAG tasks


I have multiple DAGs in my Airflow environment, some of which need to run 32 tasks concurrently. Other DAGs get queued while those tasks are executing. I am running on EC2 instances, with 16 CPUs for the worker and 8 CPUs for the scheduler, and I have set `parallelism`, `dag_concurrency`, and `worker_concurrency` to 64. I am not sure this is the right approach. I can now run more than 32 tasks from different DAGs, but the DAG stays in the running state while its tasks are set to `up_for_retry` status and either run or fail after some time. Is it better to set up multiple workers with fewer CPUs each, or should other parameters be changed to ensure a seamless flow?
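For reference, here is a sketch of the settings described above as they would appear in `airflow.cfg`. The option names assume Airflow 2.x with the Celery executor; note that `dag_concurrency` was renamed to `max_active_tasks_per_dag` in Airflow 2.2, so the correct key depends on the version in use.

```ini
[core]
# Maximum number of task instances that can run at once
# across the entire Airflow installation
parallelism = 64
# Maximum number of task instances allowed to run per DAG
# (named dag_concurrency before Airflow 2.2)
max_active_tasks_per_dag = 64

[celery]
# Number of task slots a single Celery worker will take on at once
worker_concurrency = 64
```

One thing worth noting: with a single 16-CPU worker, `worker_concurrency = 64` oversubscribes the machine roughly 4x for CPU-bound tasks, and that kind of contention is a plausible contributor to tasks flapping into `up_for_retry`.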

