I usually work on projects with microservices, and I was running into performance problems running everything under Docker on my local laptop. For now I have created a remote environment in AWS where Docker runs, and I run all the containers there when testing. I use Nodemon to watch for file changes and rebuild the artifact, but since the containers run remotely while my code changes happen locally, I need a suggestion for keeping the files synced between the two environments, LOCAL to REMOTE. I stay connected to the remote machine over SSH. I tried mounting the remote folder on my local device using sshfs, but it does not work the way I need, so I am still stuck with a trade-off. Please let me know which path I should take.
ssh, remote file, docker
95 Views · Asked by Cleriston Martins Cardoso · There are 2 best solutions below
I do this sort of test in three stages.
For the first stage, don't involve Docker at all. Use Node, Nodemon, and whatever other local tools to do development as normal. In your test suite, use mocks and similar techniques so that you can verify your application behavior without actually being able to connect to other containers. If you need to do manual testing in this environment, you can configure your application to talk to the remote server, or use a port-forward (ssh -L, kubectl port-forward).
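For the manual-testing case, an SSH local port-forward can expose a remote container's port on the local machine so the locally running app can talk to it. A minimal sketch, where the host name and ports are placeholders:

```shell
# Forward local port 8080 to port 8080 on the remote Docker host.
# The locally running Node app can then reach the remote service
# at http://localhost:8080. -N means "forward only, no shell".
ssh -N -L 8080:localhost:8080 ec2-user@my-docker-host.example.com
```

Leave this running in a separate terminal while you develop; when the tunnel is up, the remote service looks like a local one to your application config.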
For the second stage, docker build an image. Run that image locally, without any sort of volume mounts and without using Nodemon. You might again be able to configure the container to talk to the remote system, or you might be able to use a tool like Docker Compose to run the entire application stack locally. Run a set of integration tests against this local-but-containerized environment.
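The second stage might look like this, assuming a Dockerfile in the project root (the image name, port, and environment variable are hypothetical):

```shell
# Build an image from the Dockerfile in the current directory.
docker build -t myapp:dev .

# Run it locally with no volume mounts and no Nodemon: the code in
# the container is exactly what was baked in at build time. The
# BACKEND_URL variable is a placeholder for however your app is
# configured to find its dependencies.
docker run --rm -p 3000:3000 \
  -e BACKEND_URL=http://my-docker-host.example.com:8080 \
  myapp:dev
```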
If that works, commit and push your branch to source control and have your CI system build an official image.
Now on the target system, you (or your CI system) need to change the image tag to what your CI built, and delete and recreate the container. Pulling the image will include the new code directly in the image. This is the same way you'll deploy the application, so this gives you pre-deployment environment to do full-system tests.
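On the target system, the delete-and-recreate step is plain Docker commands; a sketch with placeholder registry, tag, and container names:

```shell
# Pull the image your CI built; the new code comes along inside it.
docker pull registry.example.com/myapp:build-1234

# Delete the old container and recreate it from the new image.
docker stop myapp && docker rm myapp
docker run -d --name myapp -p 3000:3000 \
  registry.example.com/myapp:build-1234
```

Tools like Docker Compose (`docker compose pull && docker compose up -d`) do the same recreate step for a whole stack at once.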
None of these steps involve Docker mounts; when Docker is involved, the code is exactly the code in the image. There is no filesystem syncing or remote connection between machines. This same fundamental approach works with any language, even compiled languages where the running application doesn't usually include source code.
I've used lsyncd for this. It monitors your local file system for changes and then uses rsync via ssh to mirror the changes to a remote machine.
You can also create more complex config files that add file exclusions and tune the sync delay and response times.
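A minimal sketch, assuming lsyncd and rsync are installed and SSH key access to the remote host is already set up (paths and host name are placeholders):

```shell
# Continuously mirror the local project directory to the remote host
# over rsync+ssh; Nodemon running remotely then sees every change.
lsyncd -rsyncssh ~/projects/myapp \
  ec2-user@my-docker-host.example.com /home/ec2-user/myapp

# For exclusions and delay tuning, point lsyncd at a config file
# instead:  lsyncd /path/to/lsyncd.conf.lua
# The config is Lua and might contain something like:
#   sync {
#     default.rsyncssh,
#     source    = "/home/me/projects/myapp",
#     host      = "ec2-user@my-docker-host.example.com",
#     targetdir = "/home/ec2-user/myapp",
#     delay     = 1,
#     exclude   = { "node_modules", ".git" },
#   }
```

Excluding `node_modules` matters here: syncing it is slow, and the remote side should install its own dependencies anyway.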