Recently we needed to transfer a single large file of almost 1 GB, and our internet connection was very unstable, running at roughly 40 KB/s. At that speed the transfer would take around 7 hours even without a single disconnection, and because the connection kept dropping, neither scp nor rsync was practical.
The next option was to split the large image into smaller chunks, transfer those chunks to the server, and then reassemble them into a single file. We avoided this because of the extra steps involved. Since we had SSH access to the server, the remaining option was to mount the remote folder on the local machine and run “wget -c” from the local PC into the locally mounted remote folder.
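For reference, the split-and-reassemble alternative we skipped can be sketched as below; the file name, chunk size, and chunk prefix are illustrative, and the transfer step itself is left out.

```shell
# On the sending side: split into 50 MB chunks and record a checksum
# of the original file.
split -b 50M largefile.img largefile.part.
sha256sum largefile.img > largefile.img.sha256

# Transfer each largefile.part.* chunk individually (retrying any that
# fail), then on the receiving side reassemble and verify:
cat largefile.part.* > largefile.img
sha256sum -c largefile.img.sha256
```

The verification step is what makes this approach safe: if any chunk arrived corrupted, `sha256sum -c` fails.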
This can be done as follows. First, install sshfs:
$ sudo apt-get install sshfs
Check the current user and add that user to the fuse group:
$ whoami
$ sudo adduser my_username fuse
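To confirm the group change took effect (you may need to log out and back in first), list the current user's groups; note that on some newer distributions a dedicated fuse group is not required at all, so treat this as a sanity check:

```shell
# List the groups the current user belongs to; "fuse" should appear
# in the output once the new login session picks up the change.
id -nG "$(whoami)"
```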
Now make the local directory for mounting the remote folder as,
$ mkdir my_localdirectory
Mount the remote directory, using sshfs as,
$ sshfs remote_username@remote_host_ip_or_domainname:/remote_directory_path/remote_directory my_localdirectory_path/my_localdirectory
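On a flaky link it can help to mount with reconnection options. This is a sketch using standard sshfs and ssh options (the host and paths are the same placeholders as above); it requires a reachable server, so adjust before running:

```shell
# Same mount as above, with options that help on unstable links:
#   reconnect              - re-establish the SSH session after a drop
#   ServerAliveInterval=15 - send a keepalive probe every 15 seconds
#   ServerAliveCountMax=3  - declare the link dead after 3 missed probes,
#                            which triggers the reconnect
sshfs -o reconnect,ServerAliveInterval=15,ServerAliveCountMax=3 \
    remote_username@remote_host_ip_or_domainname:/remote_directory_path/remote_directory \
    my_localdirectory_path/my_localdirectory
```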
Now we have the remote directory from the server mounted on the local directory, which we can confirm using the mount command:
$ mount | grep sshfs
remote_username@remote_host_ip_or_domainname:/remote_directory_path/remote_directory on my_localdirectory_path/my_localdirectory type fuse.sshfs (rw,nosuid,nodev,user=my_username)
Now we will use wget with the “-c” option to copy our large file to the server as below,
$ cp largefile.img /var/www/html
$ cd my_localdirectory
$ wget -c http://my_pc's_local_ip/largefile.img
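The step above assumes a web server (such as Apache) is already serving /var/www/html on the local PC. If none is installed, Python's built-in HTTP server is a quick stand-in; it serves whatever directory it is started in, on a port of your choosing:

```shell
# Serve the directory containing largefile.img on port 8000
# (any free port works; the path is a placeholder).
cd /path/to/directory_with_largefile
python3 -m http.server 8000

# Then, from the mounted remote directory, include the port in the URL:
#   wget -c http://my_pc's_local_ip:8000/largefile.img
```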
That’s it. Now, even if your internet is unstable and disconnects, you can rerun the above “wget -c” command to resume copying from where it left off.
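After the transfer completes, it is worth confirming the copy is intact, especially after several resumes. A simple way, assuming sha256sum is available, is to hash the original and the transferred copy and compare:

```shell
# Hash the original on the local PC and the transferred copy
# (visible through the sshfs mount); the two digests must match.
sha256sum /var/www/html/largefile.img
sha256sum my_localdirectory/largefile.img
```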
Once the “wget” command completes, unmount the remote folder. Move out of the mounted directory first, since fusermount cannot unmount a directory that is still in use:
$ cd ..
$ fusermount -u my_localdirectory