I need to connect to an external API and query it for relevant data. The API may hold millions of relevant records, which have to be retrieved in batches of 500. The stack is PHP/Laravel and the application is deployed on Laravel Vapor's serverless architecture.
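To make the setup concrete, here is a minimal sketch of fetching one 500-record page with Laravel's HTTP client. The endpoint, the `page`/`per_page` parameter names, and the `data` key in the response are placeholders; the real API will differ.

```php
use Illuminate\Support\Facades\Http;

// Fetch a single page of up to 500 records from the (hypothetical) endpoint.
function fetchPage(string $baseUrl, int $page, int $perPage = 500): array
{
    $response = Http::retry(3, 200)            // retry transient failures
        ->get("{$baseUrl}/records", [
            'page'     => $page,
            'per_page' => $perPage,
        ]);

    $response->throw();                        // fail loudly on 4xx/5xx

    return $response->json('data', []);        // assumes records live under "data"
}
```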
Because Vapor runs queued jobs on AWS Lambda, a background job has a hard execution limit of 15 minutes. To work around this limit, I check before each paginated request whether the job's total execution time has exceeded 13 minutes; if it has, I launch a new background job that continues the data retrieval where the previous one stopped. The new job is launched using a Laravel event and listener.
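A minimal sketch of that job, with placeholder names (`ImportRecords`, the `fetchPage()` helper from above, the config key) standing in for the real implementation. For brevity it re-dispatches itself directly instead of going through the event and listener, but the hand-over point is the same:

```php
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ImportRecords implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    // Stay comfortably under Lambda's 15-minute ceiling.
    private const TIME_BUDGET_SECONDS = 13 * 60;

    public function __construct(public int $page = 1) {}

    public function handle(): void
    {
        $startedAt = microtime(true);

        while (true) {
            $records = fetchPage(config('services.external_api.url'), $this->page);

            if (empty($records)) {
                return; // no more data, import finished
            }

            // ...store/process the batch of up to 500 records here...

            $this->page++;

            // If the time budget is nearly spent, hand over to a fresh job
            // (or fire the event the listener reacts to) and stop this one.
            if (microtime(true) - $startedAt > self::TIME_BUDGET_SECONDS) {
                self::dispatch($this->page);
                return;
            }
        }
    }
}
```

The job carries the current page number in its constructor, so each re-dispatch resumes exactly where the previous invocation stopped.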
My question is: is this a correct approach or are there better ways to solve this with the stack I use?