Airflow DAG run marked as success although the tasks didn't run


Recently, we have been seeing an issue in Airflow where certain DAGs do not run any tasks but their DAG runs are still marked as successful. We set the start_date using days_ago from Airflow:

from airflow.utils.dates import days_ago

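For reference, a minimal sketch of the kind of DAG definition in question (the dag_id, schedule, and task are illustrative, not our real ones):

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.utils.dates import days_ago

with DAG(
    dag_id="example_dynamic_start_date",  # hypothetical name
    start_date=days_ago(0),               # dynamic: re-evaluated to "today" at every parse
    schedule_interval="@daily",
) as dag:
    BashOperator(task_id="hello", bash_command="echo hello")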


2 Answers

Shivangi Singh

From: https://forum.astronomer.io/t/dag-run-marked-as-success-but-no-tasks-even-started/1423

If you see dag runs that are marked as success but don’t have any task runs, this means the dag runs’ execution_date was earlier than the dag’s start_date.

This is most commonly seen when the start_date is set to some dynamic value, e.g. airflow.utils.dates.days_ago(0). This creates the opportunity for the execution date of a delayed dag execution to be before what the dag now thinks is its start_date. This can even happen in a cyclic pattern, where a few dag runs will work, and then at the beginning of every day a dag run will experience this problem.
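To make the timing concrete, here is a rough, hypothetical illustration of how the dates can end up colliding (the specific dates are made up):

from airflow.utils.dates import days_ago

# Suppose the DAG file is parsed shortly after midnight UTC on 2021-09-02:
start_date = days_ago(0)   # evaluates to 2021-09-02T00:00:00+00:00 ("today" at midnight)

# The scheduler then creates the run for the 2021-09-01 daily interval,
# whose execution_date is 2021-09-01T00:00:00+00:00 -- earlier than the
# freshly re-evaluated start_date, so the run ends up with no task runs
# and is marked as success.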

The simplest way to avoid this problem is to never use a dynamic start_date. It is always better to specify a static start_date. If you are concerned about accidentally triggering multiple runs of the same DAG, just set catchup=False.
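A sketch of what that recommendation could look like in practice (the dag_id, date, and task are placeholders):

import pendulum
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_static_start_date",                   # hypothetical name
    start_date=pendulum.datetime(2021, 1, 1, tz="UTC"),   # static: never moves
    schedule_interval="@daily",
    catchup=False,  # prevents backfilling every interval since 2021-01-01
) as dag:
    BashOperator(task_id="hello", bash_command="echo hello")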

enrique.tuya

There is an open ticket in the Airflow project tracking this issue: https://github.com/apache/airflow/issues/17977