I am looking for a way to trigger Airflow tasks when a user uploads ANY file to the DAG directory. Which Airflow operator can help in this case?
I know about the FileSensor operator, but as far as I understand, it waits for a specific file to appear in the directory.
If Airflow doesn't have such an operator, what should I try instead?
You can create a folder for uploaded files (for example 'user_upload/'). Then list all files under this folder and read each file's modification or creation time with os.path.getmtime or os.path.getctime. Compare these times with template variables such as data_interval_start or ts to pick out the files that are new for the current run. If there are no new files, raise AirflowSkipException so the downstream tasks are skipped.
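A minimal sketch of this idea, assuming Airflow 2.x with the TaskFlow API, a 10-minute schedule, and a hypothetical `user_upload/` folder (adjust `UPLOAD_DIR` and the schedule to your setup):

```python
import os
from datetime import datetime, timezone

import pendulum
from airflow.decorators import dag, task
from airflow.exceptions import AirflowSkipException

UPLOAD_DIR = "user_upload/"  # hypothetical path to the watched folder


@dag(
    schedule="*/10 * * * *",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
)
def process_new_uploads():
    @task
    def find_new_files(data_interval_start=None):
        """Return files modified since the start of this run's data interval."""
        new_files = []
        for name in os.listdir(UPLOAD_DIR):
            path = os.path.join(UPLOAD_DIR, name)
            if not os.path.isfile(path):
                continue
            # os.path.getmtime returns an epoch timestamp; make it timezone-aware
            # so it can be compared with the pendulum datetime Airflow injects.
            mtime = datetime.fromtimestamp(os.path.getmtime(path), tz=timezone.utc)
            if mtime >= data_interval_start:
                new_files.append(path)
        if not new_files:
            # Mark this task as skipped when nothing new has arrived.
            raise AirflowSkipException(f"No new files in {UPLOAD_DIR}")
        return new_files

    @task
    def process(files: list[str]):
        for path in files:
            print(f"Processing {path}")

    process(find_new_files())


process_new_uploads()
```

Because the downstream task uses the default `all_success` trigger rule, the skip raised in `find_new_files` propagates, so `process` is skipped as well on runs with no new uploads.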