I have looked at the Celery codebase to get an idea of how to implement a built-in scheduled task, but I am not sure whether that's the right approach for my case.
I have a third-party package (meant to be used in Django projects) that needs to ship with a built-in scheduled Celery task, just as backend_cleanup is built into Celery.
The developer using the package needs to be able to disable the task or update the crontab schedule using some Django settings.
This is what I have so far in the package code based on what I saw in the docs:
# tasks.py
from celery import Celery, shared_task
from celery.schedules import crontab
from django.conf import settings

app = Celery()

@shared_task
def do_stuff():
    # do some stuff that will access the database
    pass

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    okay_to_run = getattr(settings, 'OKAY_TO_RUN', False)
    if okay_to_run:
        default_schedule = crontab(0, 0, day_of_month='2')
        schedule = getattr(settings, 'OKAY_TO_RUN_SCHEDULE', None)
        if schedule is None:
            sender.add_periodic_task(default_schedule, do_stuff.s())
        else:
            sender.add_periodic_task(crontab(**schedule), do_stuff.s())
I can see that Celery detects the task, but it never gets scheduled. What could be wrong, and is there a better approach to this?
UPDATE
I was probably overthinking this. I have just made the task a normal shared_task and left the scheduling up to the developer who is re-using the app.
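With that approach, the consuming project wires up the schedule itself through Celery's beat configuration. A minimal sketch of what that project's settings could look like, assuming `mypackage` is a hypothetical name for this package and `mypackage.tasks.do_stuff` is the task's dotted path:

```python
# settings.py of the Django project re-using the app.
# 'mypackage' is a placeholder -- substitute the real dotted path to the task.
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'mypackage-do-stuff': {
        'task': 'mypackage.tasks.do_stuff',
        # run at midnight on the 2nd of every month, matching the
        # default schedule from the original code above
        'schedule': crontab(minute=0, hour=0, day_of_month='2'),
    },
}
```

This keeps the package itself free of any beat wiring: it only exposes the shared_task, and each project decides whether and when to run it.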