Is there a way to push huge amounts of data to GitHub? I have a massive collection of files I am trying to back up, but when I try to push them, Git will time out or stop working in the middle of it.
Current solution: commit and push small portions individually (at most 1 to 2 GB at a time), roughly like the loop below.
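This is the gist of my workaround; `big-data/` and the branch name are placeholders, and I split the files so each subdirectory stays under about 2 GB:

```bash
# Commit and push one subdirectory at a time so no single
# push is larger than the connection can handle.
for dir in big-data/*/ ; do
    git add "$dir"
    git commit -m "backup batch: $dir"
    git push origin main
done
```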
Additional Information:
- It is a private repository accessed over SSH.
- I am using Git GUI and Git Bash.
- Attempting to back up 30 GB of data
What I want to do: commit all changes and push all of the data overnight (approx. 6 hours).
Git is not well suited to such a huge amount of data, and most of that data would not benefit from delta compression of any kind, because of its binary nature.
Instead of a Git repository, you would use a cloud storage service. A dedicated tool like rclone is a better fit for the job, since it can:

- sync to most major cloud storage providers (S3, Google Drive, Dropbox, Backblaze B2, ...) through one interface;
- retry failed transfers, so an overnight run survives network hiccups;
- verify uploads against the source with checksums;
- limit bandwidth with `--bwlimit` so the link stays usable.
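A minimal sketch, assuming rclone is installed, a remote named `backup:` has already been set up with `rclone config`, and the data lives in `~/big-data` (the remote name and paths are placeholders):

```bash
# One-way sync of the local folder to the remote; on later runs
# only new or changed files are re-uploaded.
#   --progress    show live transfer stats
#   --transfers 8 upload up to 8 files in parallel
#   --bwlimit 10M optional: cap bandwidth at 10 MB/s
rclone sync ~/big-data backup:my-backup --progress --transfers 8 --bwlimit 10M

# Afterwards, compare source and destination to verify the upload.
rclone check ~/big-data backup:my-backup
```

If the transfer dies mid-run, re-running the same `rclone sync` command simply picks up where it left off, which is exactly what an unattended overnight backup needs.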