I'm using Airflow with AWS.
I built the Airflow service on AWS ECS. After deploying it to the staging environment and monitoring it, I can see that storage read/write bytes per second keep increasing over time, as shown below (the growth looks roughly linear).

There is a simple task that refreshes a Postgres materialized view, and it runs every 31 seconds. (There are also several tasks that execute SQL queries, but I don't think those are causing the storage read/write increase.) A simplified version of the refresh DAG is shown below.
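
The DAG is roughly this (a minimal sketch; the view name and connection ID are placeholders):

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.postgres.operators.postgres import PostgresOperator

with DAG(
    dag_id="refresh_materialized_view",
    start_date=datetime(2023, 1, 1),
    schedule=timedelta(seconds=31),  # run every 31 seconds
    catchup=False,
) as dag:
    refresh = PostgresOperator(
        task_id="refresh_view",
        postgres_conn_id="my_postgres",            # placeholder connection ID
        sql="REFRESH MATERIALIZED VIEW my_view;",  # placeholder view name
    )
```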
The Docker image is public.ecr.aws/bitnami/airflow:2.5.3.
I built an ECS task definition with 4 containers (webserver, scheduler, Celery worker, Celery Flower), all using the same image. I'm not using AWS's managed Airflow. If I trigger a CodeDeploy blue/green deployment to bring up new containers, storage read/write drops back to normal.
I also configured remote logging so task logs are saved to S3.
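
Simplified, the relevant part of the container setup looks roughly like this (a sketch only; the bucket name and connection ID are placeholders):

```python
# All four containers share the same image and the same Airflow environment variables.
AIRFLOW_IMAGE = "public.ecr.aws/bitnami/airflow:2.5.3"

# Environment variables that point Airflow task logs at S3 instead of local disk.
remote_logging_env = [
    {"name": "AIRFLOW__LOGGING__REMOTE_LOGGING", "value": "True"},
    {"name": "AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER", "value": "s3://my-airflow-logs"},  # placeholder bucket
    {"name": "AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID", "value": "aws_default"},               # placeholder connection
]

container_definitions = [
    {"name": name, "image": AIRFLOW_IMAGE, "environment": remote_logging_env}
    for name in ("webserver", "scheduler", "worker", "flower")
]
```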
I can't figure out why storage usage keeps increasing over time :( Please help.
(I also find it hard to believe the Airflow containers could write 8 gigabytes per second; the image I'm using is not very large.)