How do I prevent git from timing out when pushing huge amounts of data?


Is there a way to push a huge amount of data to GitHub? I have a massive collection of files I am trying to back up, but when I push them, Git times out or stops working partway through.

Current workaround: commit and push small portions individually (at most 1 to 2 GB at a time).
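That workaround can be scripted as a loop that stages, commits, and pushes a few files at a time, so no single push has to survive the whole transfer. The sketch below is illustrative: the batch size, file names, and the throwaway local bare repository standing in for GitHub are all assumptions; point `origin` at your real SSH remote instead.

```shell
set -e
workdir=$(mktemp -d)

# Stand-in for the GitHub remote (a local bare repository).
git -c init.defaultBranch=main init -q --bare "$workdir/remote.git"

# Working repository holding the files to back up.
git -c init.defaultBranch=main init -q "$workdir/backup"
cd "$workdir/backup"
git config user.email "you@example.com"   # placeholder identity
git config user.name  "Backup Script"
git remote add origin "$workdir/remote.git"

# Fake data set; in reality these would be your large files.
for i in 1 2 3 4 5 6; do echo "data $i" > "file$i.bin"; done

# Stage, commit, and push two files at a time.
batch=0
for f in file*.bin; do
  git add "$f"
  batch=$((batch + 1))
  if [ "$batch" -eq 2 ]; then
    git commit -qm "backup batch ending at $f"
    git push -q origin HEAD:main          # one small push per batch
    batch=0
  fi
done
if [ "$batch" -gt 0 ]; then               # commit and push any remainder
  git commit -qm "final backup batch"
  git push -q origin HEAD:main
fi

git ls-tree -r --name-only origin/main   # all files should now be on the remote
```

Each push transfers only the latest small commit, so a dropped connection costs one batch rather than the whole 30 GB.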

Additional Information:

  • It is a private repository accessed over SSH.
  • I am using Git GUI and Git Bash.
  • I am attempting to back up 30 GB of data.

What I want to do: commit all changes and push all the data overnight (approx. 6 hrs).

1 Answer

Answered by VonC:

Git is not well suited to such a huge amount of data: most of it, being binary, would not benefit from Git's delta compression.

A dedicated tool such as rclone is a better fit for the job, since it can:

  • Backup (and encrypt) files to cloud storage
  • Restore (and decrypt) files from cloud storage
  • Mirror cloud data to other cloud services or locally
  • Migrate data to the cloud, or between cloud storage vendors
  • Mount multiple, encrypted, cached or diverse cloud storage as a disk

Instead of using a Git repository, you would use a Cloud backup service.
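As a sketch of what that looks like, assuming a remote named gdrive has already been set up with `rclone config` (the remote name and both paths are placeholders):

```shell
# Back up the local folder to the cloud remote. Transfers are per-file and
# resumable, so a dropped connection does not restart the whole 30 GB.
rclone sync /path/to/backup gdrive:backup --progress

# Restore later (copy never deletes files at the destination):
rclone copy gdrive:backup /path/to/restore --progress
```

Note that `rclone sync` makes the destination match the source, deleting remote files that no longer exist locally; use `rclone copy` if you only ever want to add files.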