I am copying a large (19 GB) file from the internet to Google Cloud Storage using the method described here and here (I use Cloud Shell, https://cloud.google.com/shell). So far so good, but the transfer is expected to take 11+ hours. Is there an alternative way to transfer such large files that allows resuming the transfer if something goes wrong partway through?
yulia_v@cloudshell:~$ curl -L https://archive.org/a_file.7z | gsutil cp - gs://a_bucket/a_file.7z
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
Copying from <STDIN>...
  1 19.0G    1  237M    0     0   483k      0 11:28:22  0:08:23 11:19:59  351k/ [0 files][100.0 MiB/    0.0 B]    0.0 B/s
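One idea I am considering, sketched below: stage the file on local disk instead of piping stdin, so both legs can be resumed. curl's `-C -` flag continues a partial download when the same command is rerun, and `gsutil cp` from a regular file (unlike from stdin) performs resumable uploads for large objects. The URL, temp path, and bucket below are just the placeholders from my command, and this assumes enough local disk is available (Cloud Shell's home disk is small, so a VM might be needed). The script only prints the commands it would run:

```shell
#!/bin/sh
# Placeholders taken from the original command; adjust as needed.
URL="https://archive.org/a_file.7z"
TMP="${TMPDIR:-/tmp}/a_file.7z"
BUCKET="gs://a_bucket"

# curl -C - resumes a partial download from where it stopped; after an
# interruption, rerunning the identical command continues the file.
download_cmd="curl -L -C - -o $TMP $URL"

# gsutil cp on a local file uses resumable uploads for large objects,
# so this command can also simply be rerun after a dropped connection.
upload_cmd="gsutil cp $TMP $BUCKET/a_file.7z"

# Dry run: print the two commands rather than executing a 19 GB transfer.
echo "$download_cmd"
echo "$upload_cmd"
```

The trade-off versus the stdin pipe is disk space: the pipe needs no local storage but restarts from zero on failure, while the staged version needs room for the whole file but either leg can pick up where it left off.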