I have two databases: a local PostgreSQL database and a cloud database accessed via HTTP requests. I have implemented two Python classes that share the same interface for interacting with these databases.
I want these classes to work together: store data in the local PostgreSQL database when the cloud database is unavailable, then synchronize that data with the cloud once it becomes available again.
To achieve this, I'm considering creating an additional class that implements the same interface and adds the following logic:
- Handle each incoming request with the PostgreSQL class first.
- Store the request details in a PostgreSQL table named "sync_with_cloud."
- Return the response immediately.
- In the background, replay the same request through the cloud class.
- If the cloud call completes successfully, update the "status" field of the corresponding "sync_with_cloud" row to "success." Otherwise, set it to "fail."
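A minimal sketch of what I have in mind, in case it helps clarify the question. `LocalDB` and `CloudDB` are stand-ins for my real PostgreSQL and HTTP-backed classes (the names, the `save` method, and the in-memory "tables" are all illustrative, not my actual code); the wrapper writes locally, records a pending sync entry, and lets a background thread attempt the cloud write:

```python
import threading
import queue

class LocalDB:
    """Stand-in for the PostgreSQL-backed class."""
    def __init__(self):
        self.rows = []
        self.sync_with_cloud = []  # mirrors the "sync_with_cloud" table

    def save(self, record):
        self.rows.append(record)
        entry = {"record": record, "status": "pending"}
        self.sync_with_cloud.append(entry)
        return entry

class CloudDB:
    """Stand-in for the HTTP-backed class; raises when unreachable."""
    def __init__(self, available=True):
        self.available = available
        self.rows = []

    def save(self, record):
        if not self.available:
            raise ConnectionError("cloud unavailable")
        self.rows.append(record)

class SyncedDB:
    """Same interface: saves locally first, syncs to cloud in background."""
    def __init__(self, local, cloud):
        self.local = local
        self.cloud = cloud
        self._queue = queue.Queue()
        self._worker = threading.Thread(target=self._sync_loop, daemon=True)
        self._worker.start()

    def save(self, record):
        entry = self.local.save(record)  # immediate local write
        self._queue.put(entry)           # schedule the background cloud sync
        return record

    def _sync_loop(self):
        while True:
            entry = self._queue.get()
            if entry is None:  # shutdown sentinel
                break
            try:
                self.cloud.save(entry["record"])
                entry["status"] = "success"
            except Exception:
                entry["status"] = "fail"

    def close(self):
        # Drain the queue, then stop the worker.
        self._queue.put(None)
        self._worker.join()
```

The queue keeps the response path fast: `save` returns as soon as the local write lands, and the cloud attempt happens on the worker thread in submission order.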
With this approach I can leverage both databases concurrently: PostgreSQL provides immediate results while the cloud operation runs asynchronously in the background, and I can query the "sync_with_cloud" table to find records whose synchronization with the cloud failed.
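For the failure check, the query itself is simple. SQLite stands in for PostgreSQL below purely to keep the example self-contained, and the table schema is an assumption (my real columns may differ); the SQL shape is the same on either database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Assumed schema for the sync bookkeeping table.
conn.execute("""
    CREATE TABLE sync_with_cloud (
        id INTEGER PRIMARY KEY,
        payload TEXT NOT NULL,
        status TEXT NOT NULL DEFAULT 'pending'  -- pending | success | fail
    )
""")
conn.executemany(
    "INSERT INTO sync_with_cloud (payload, status) VALUES (?, ?)",
    [('{"id": 1}', "success"), ('{"id": 2}', "fail"), ('{"id": 3}', "pending")],
)

# Rows whose cloud sync failed and should be retried.
failed = conn.execute(
    "SELECT id, payload FROM sync_with_cloud WHERE status = 'fail'"
).fetchall()
```

A periodic retry job could run this query and replay each returned payload against the cloud class.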
Are there any best practices for coordinating these two databases this way?
Thank you in advance.