We have a customer that connects to our servers via satellite. However, they have a major concern: the facilities in which they want to run our applications often have connection issues. So, they want to run our applications locally (on specific PCs). Data from the host system would feed to the local machines continuously. If the connection is lost, the PC still has enough data to conduct its business until the connection is restored. At that point, the data changes from the PC are reflected back to the host system and vice versa.
I guess we would be considering some type of replication (this is all new to me). This raises many questions, but here are the main ones.
If we replicate, they need a copy of SQL Server on each PC. We are talking about 60 sites, which would be very expensive due to licensing, plus other support costs.
Is it better to always run replication, or only in the event that the connection is lost?
How does the local system get in sync with the hosted system?
Just looking for a better/less expensive solution.
The way I see it, there are two ways to go about it (depending on your requirements).
If you think the problem will not persist, you can use the circuit breaker pattern: https://learn.microsoft.com/en-us/azure/architecture/patterns/circuit-breaker
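Here's a minimal sketch of the idea in Python (the same shape works in any language); `push_changes_to_host` below is a hypothetical stand-in for whatever call you make to the host system, and the threshold/timeout values are just illustrative. After a few consecutive failures the breaker "opens" and fails fast for a cool-down period instead of hammering the dead link:

```python
import time

class CircuitBreaker:
    """Minimal circuit breaker: stop calling a failing remote host for a
    cool-down period instead of retrying on every single request."""

    def __init__(self, failure_threshold=3, reset_timeout=30.0):
        self.failure_threshold = failure_threshold  # failures before opening
        self.reset_timeout = reset_timeout          # seconds to stay open
        self.failure_count = 0
        self.opened_at = None                       # None means "closed"

    def call(self, func, *args, **kwargs):
        # While open, fail fast until the cool-down period has elapsed.
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                raise RuntimeError("circuit open: remote host assumed down")
            self.opened_at = None  # half-open: allow one trial call

        try:
            result = func(*args, **kwargs)
        except Exception:
            self.failure_count += 1
            if self.failure_count >= self.failure_threshold:
                self.opened_at = time.monotonic()  # open the circuit
            raise
        else:
            self.failure_count = 0  # a success closes the circuit again
            return result
```

With something like this in place, each outbound call becomes `breaker.call(push_changes_to_host, batch)`, so a dead link is detected once and subsequent calls fail immediately until the timeout expires.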
If you need to retry indefinitely and you can't afford to lose data, then you will need a custom solution.
In a totally local environment you could go with either a local database like SQLite, where you can store items and retry them if a call is not successful, or store the calls in Microsoft Message Queuing (MSMQ). Then you can build a service that reads the database or the queue and retries.
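For the SQLite route, a minimal store-and-forward "outbox" might look like the sketch below; `send_to_host` is a hypothetical placeholder for your actual upload call, and the schema is an assumption just to illustrate the shape. Rows are deleted only after the host accepts them, so nothing is lost while the link is down:

```python
import sqlite3
import time

# Hypothetical sender; replace with your real call to the host system.
def send_to_host(payload: str) -> None:
    raise ConnectionError("satellite link down")

conn = sqlite3.connect("outbox.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS outbox ("
    " id INTEGER PRIMARY KEY AUTOINCREMENT,"
    " payload TEXT NOT NULL,"
    " created_at REAL NOT NULL)"
)
conn.commit()

def enqueue(payload: str) -> None:
    # The application writes here instead of calling the host directly.
    conn.execute(
        "INSERT INTO outbox (payload, created_at) VALUES (?, ?)",
        (payload, time.time()),
    )
    conn.commit()

def drain_outbox() -> None:
    # A background service runs this periodically. A row is deleted only
    # after the host accepts it, so a failure leaves it queued for retry.
    for row_id, payload in conn.execute(
        "SELECT id, payload FROM outbox ORDER BY id"
    ).fetchall():
        try:
            send_to_host(payload)
        except ConnectionError:
            break  # link still down; keep remaining rows and retry later
        conn.execute("DELETE FROM outbox WHERE id = ?", (row_id,))
        conn.commit()

enqueue('{"order_id": 17, "status": "shipped"}')
drain_outbox()  # the row survives the failed send and is retried next run
```

The same pattern covers the sync question in reverse: the host keeps its own outbox per site, and each side drains its queue whenever the connection is up.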