huge site migration - urgent help needed

Shrex

Member
Apr 28, 2010
5
0
51
Mangalore
You may use any tool you like to transfer files, such as compressed backup archives, from one location to another; however, "wget" may not be a safe or desirable option for the amount of data (48 GB) stated in the opening post of this thread.

If you would like to elaborate on how you are suggesting the "wget" utility be used under those circumstances, the additional detail would be very much appreciated and should help ensure a better understanding.

As far as I know, the 'wget' command is safe for downloads, and wget itself imposes no file-size limit.

Basic wget command syntax:

wget Link_of_download_file

example:
wget http://www.abc.com/abc.zip

Where Link_of_download_file is the actual URL of the file you want to download. The URL doesn't have to use HTTP; wget can also download from FTP sources.
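For instance, downloading from an FTP source works the same way (the host and file name below are only placeholders, like the abc.com example above):

wget ftp://ftp.abc.com/abc.zip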

wget does NOT have a file-size limit by itself. There is still a chance that wget won't work for very large files: the system where you download the file may have a file-size limit (run "ulimit -a" to see if there is a local file-size limit), or the FTP server you're downloading from may impose its own limit.
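For example, you could check for a local limit before starting a large download (a quick sketch):

ulimit -a | grep "file size"

If that line reports "unlimited", the local shell is not capping the file size.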

If the download is interrupted for some reason, you can attempt to pick it up where wget left off with the command:

wget -c Same_Link_of_download_file_above

example:
wget -c http://www.abc.com/abc.zip


Where http://www.abc.com/abc.zip is the same link whose download was interrupted.

Regards!
Shrex
 

cPanelDon

cPanel Quality Assurance Analyst
Staff member
Nov 5, 2008
2,544
13
268
Houston, Texas, U.S.A.
cPanel Access Level
DataCenter Provider
That is correct, the utility "wget" may be used for any file download; however, how does that improve upon other solutions that were previously discussed in this thread such as using "rsync" via SSH?
 

Shrex

Member
Apr 28, 2010
5
0
51
Mangalore
I am not sure about the limitations of "rsync" compared to "wget", but I feel "rsync" is ideal for transferring files of limited size, or for maintaining an image or backup of the server, while "wget" is the best way to transfer huge files, and it's fast too.

Regards!
Shrex :)
 

Miraenda

Well-Known Member
Jul 28, 2004
243
5
168
Coralville, Iowa USA
cPanel Access Level
Root Administrator
I would suggest doing the following. First, on the new machine, create a file called /tmp/excludelist listing the files and folders you want excluded from the rsync; these are the ones you already have the large .tar.gz files set up for. Put one entry per line, like this:

Code:
file.tar.gz
file2.tar.gz
public_html/folder1
public_html/folder2
Anything within a listed folder will be excluded, so you only need the path to that folder. Note that the paths are relative to /home/username, since that is the starting directory you'll be using, so do not include /home/username in the path.
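One quick way to create that file (just a sketch; substitute your own file and folder names) is a heredoc:

Code:
cat > /tmp/excludelist << 'EOF'
file.tar.gz
file2.tar.gz
public_html/folder1
public_html/folder2
EOF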

At that point, issue the following command on the new server inside a screen session:

Code:
rsync --archive --verbose --rsh=ssh --exclude-from=/tmp/excludelist root@oldserverIP:/home/username/ /home/username/
For the above, replace oldserverIP with the old server's IP address and username with the cPanel username. After that copy finishes, you can transfer the .tar.gz files with scp, rsync, or whatever you prefer, and then extract them on the new machine.
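For example, that final step could look something like this (just a sketch; the file name matches the excludelist example above, and the SSH user and host are placeholders):

Code:
scp root@oldserverIP:/home/username/file.tar.gz /home/username/
cd /home/username && tar -xzvf file.tar.gz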
 

Lyttek

Well-Known Member
Jan 2, 2004
775
5
168
I'd also suggest taking the advice given. If you're not familiar with rsync, get familiar with it... it's a fantastic tool! Given my experience, I'd trust rsync with my large files more than wget.