I'm trying to upload all files in a directory (upload.dir) to an SFTP server using RCurl::ftpUpload.
At the moment I do this with a for-loop, which unfortunately takes too long - I suspect because the connection is opened and closed for every single file. Is there any way to do this faster?
Here is a code snippet:
# full paths for reading the files locally
files_for_upload <- list.files(upload.dir,
                               full.names = TRUE,
                               recursive = TRUE)
# relative paths used to rebuild the directory structure on the server
files_for_upload2 <- list.files(upload.dir,
                                full.names = FALSE,
                                recursive = TRUE)
for (i in seq_along(files_for_upload)) {
  RCurl::ftpUpload(what = files_for_upload[i],
                   to = paste0("Directory_on_my_server/", files_for_upload2[i]),
                   userpwd = "my_userid:my_password",
                   # forbid.reuse = TRUE tells curl to close the connection after each transfer
                   .opts = list(ftp.create.missing.dirs = TRUE, forbid.reuse = TRUE)
  )
}
So far I have tried parallelizing the loop with the foreach package:
foreach::foreach(i = seq_along(files_for_upload)) %dopar% {
  RCurl::ftpUpload(what = files_for_upload[i],
                   to = paste0("Directory_on_my_server/", files_for_upload2[i]),
                   userpwd = "my_userid:my_password",
                   .opts = list(ftp.create.missing.dirs = TRUE, forbid.reuse = TRUE)
  )
}
This, however, gives an error:
task 3 failed - "Failed to connect to My_Server port 22
after 21252 ms: Couldn't connect to server"
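For completeness, the backend registration looks roughly like this (the worker count of 4 is arbitrary); possibly the server simply refuses that many simultaneous SFTP connections:

# parallel backend registration (worker count chosen arbitrarily)
cl <- parallel::makeCluster(4)
doParallel::registerDoParallel(cl)
# ... foreach loop from above ...
parallel::stopCluster(cl)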
Is there any way to speed this up - for example by opening one connection and using it for all files? Or maybe by uploading a zip file and unzipping it on the server from R?
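For the first idea, something like the sketch below is what I have in mind: RCurl::ftpUpload accepts a curl handle via its curl argument, so a single handle created with RCurl::getCurlHandle could perhaps be reused for every upload (untested; same placeholder server path and credentials as above, with forbid.reuse dropped so the connection can stay open). Would that actually keep the SFTP connection alive across calls?

# untested sketch: share one curl handle (and ideally one connection) across uploads
h <- RCurl::getCurlHandle(userpwd = "my_userid:my_password",
                          ftp.create.missing.dirs = TRUE)

for (i in seq_along(files_for_upload)) {
  RCurl::ftpUpload(what = files_for_upload[i],
                   to = paste0("Directory_on_my_server/", files_for_upload2[i]),
                   curl = h)  # same handle every iteration
}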