You may use any tool to transfer files, such as compressed backup archives, from one location to another; however, using "wget" may not be a safe or desirable option for such a large amount of data (i.e., the 48 GB stated in the opening post of this thread).
We would welcome it if you could elaborate on how you suggest the "wget" utility be used under these circumstances; greater detail would be much appreciated and would help ensure a better understanding.
As far as I know, the 'wget' command is safe for downloads, and wget itself imposes no file size limit.
Basic wget command syntax:
wget Link_of_download_file
example:
wget http://www.abc.com/abc.zip
Where Link_of_download_file is the actual URL of the file you want to download. The URL doesn't have to use HTTP; wget can also download from FTP sources.
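For instance, an FTP link works exactly the same way (the host and file name below are only placeholders, not a real server):
example:
wget ftp://ftp.example.com/backup/abc.zip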
wget does NOT have a file size limit by itself. There is a chance that wget won't work for very large files over FTP: the system where you download the file may have a file size limit (run ulimit -a to see if there is a local limit), or the FTP server you're downloading from may impose one.
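For example, to check just the file size limit on the machine where you are downloading (the value is reported in 512-byte blocks, and on most systems it is "unlimited"):
example:
ulimit -f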
In case the download is interrupted for some reason, you can attempt to pick it up where wget left off with the command:
wget -c Same_Link_of_download_file_above
example:
wget -c http://www.abc.com/abc.zip
Where http://www.abc.com/abc.zip is the exact link whose download was interrupted.
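For a transfer as large as 48 GB, one option worth considering is to combine -c with wget's retry and timeout options so an interrupted download restarts automatically; the URL below is the same placeholder as above, and the retry count (0 = retry indefinitely) and 60-second timeout are only suggested values:
example:
wget -c -t 0 --timeout=60 http://www.abc.com/abc.zip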
Regards!
Shrex