The Community Forums


Backup that runs for more than 12 hours

Discussion in 'General Discussion' started by anton_latvia, Feb 20, 2007.

  1. anton_latvia

    anton_latvia Well-Known Member
    PartnerNOC

    Joined:
    May 11, 2004
    Messages:
    348
    Likes Received:
    3
    Trophy Points:
    18
    Location:
    Latvia
    cPanel Access Level:
    Root Administrator
    Hello, fellow hosters. ;)
    We use cPanel extensively and are more or less happy with it. But since I handle all the dirty work - backups, server monitoring, mail queues and so on - I've got a little problem. It has been this way for a long time, but I've finally decided to discuss it with other members of the community. Enough with the pleasantries; let's face the problem head-on: backups.
    We use the default cPanel backup. Each server has a second SATA disk for backups, we run the backup three times per week, we do not use incremental backups, and at the end we get a tar.gz file for each user. The following night a cron job copies the tar.gz files to a dedicated backup server. But before that happens, there is a problem with creating the tar.gz archives themselves. If a user has a large number of files, a single account backup can take 20-30 minutes, even when its size in megabytes is not that large. Quite often there are 10-20 thousand emails in the default mail account. Roughly every second month I have to go through the most loaded servers, check for such problematic accounts, and simply clear their mailboxes.
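    To spot those accounts I basically just count the files in the default mailboxes - a rough sketch, assuming the Maildir folders live under /home/*/mail (the layout may differ on your setup):
    Code:
    # Rough way to spot accounts with huge default mailboxes
    # (assumes Maildir folders under /home/*/mail; adjust to your layout).
    for d in /home/*/mail/cur /home/*/mail/new; do
        [ -d "$d" ] || continue
        n=$(ls "$d" | wc -l)
        echo "$n $d"
    done | sort -rn | head -20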
    But how long can I keep doing it like this? We are intelligent people here, so manual work is not for us. I am sure there is a way to optimize this routine. I have tried adding
    Code:
    export GZIP="--rsyncable"
    to the crontab on one of the servers, but it did not help. (Two likely reasons: cron expects plain NAME=value environment lines, so an `export ...` line in a crontab is not valid syntax; and --rsyncable only makes gzip's output rsync-friendly - it does not speed up the packaging itself.) Then I looked through the cpbackup and pkgacct scripts, and my conclusion was that not much can be changed while the overall algorithm stays the same. So here is the task:
    We want to run the backup and end up with a tar.gz file that we can later restore with the standard cPanel tools. But we want to copy only the changed files, not re-copy the same unchanged files every other day. What do you think - would it be possible to grab the users' home folders with rsync instead of a plain copy? And as a second option: after making the archive, keep the temporary folder, so that the next time the backup runs there is no need to copy all the files over again?
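    Roughly what I have in mind, as a sketch (untested with cpbackup; /backup/staging is an invented path, and this only covers the home directory, not the full pkgacct archive):
    Code:
    #!/bin/bash
    # Sketch of the rsync-staging idea (untested; /backup/staging is a
    # made-up path). Only changed files are transferred on later runs.
    mkdir -p /backup/staging
    for user in $(ls /var/cpanel/users); do
        # Keep a persistent mirror of each home directory on the backup disk
        rsync -a --delete "/home/$user/" "/backup/staging/$user/"
        # Tar the mirror; tar still reads everything, but /home is only
        # touched for the files that actually changed
        tar czf "/backup/cpbackup/daily/$user.tar.gz" -C /backup/staging "$user"
    done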

    Anton.

    P.S.: Sorry if this has already been solved and I simply failed to find it here - just point me in the right direction. Many thanks for understanding!
     
  2. yapluka

    yapluka Well-Known Member

    Joined:
    Dec 24, 2003
    Messages:
    301
    Likes Received:
    1
    Trophy Points:
    18
    Location:
    France
    cPanel Access Level:
    Root Administrator
    Hi Anton,

    What I'm doing on our servers is:

    1) Enable incremental backups in WHM.
    2) Use the postcpbackup hook to run a script when the backup is complete. This script tar.gz's all the folders in /cpbackup/daily/, then sends them by FTP to our remote backup server (see the sketch below).

    BTW, the incremental backups can be restored just fine :)
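    In essence, the compression part boils down to something like this - a minimal sketch (the full script with the FTP upload is in my next post; the daily folder may live under /backup/cpbackup/daily on your boxes):
    Code:
    #!/bin/bash
    # Minimal sketch of the post-backup compression step
    # (the full version with the FTP upload is posted further down).
    cd /backup/cpbackup/daily || exit 1
    for user in */; do
        user=${user%/}
        tar czf "$user.tar.gz" "$user"
    done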
     
  3. anton_latvia

    anton_latvia Well-Known Member
    PartnerNOC

    Joined:
    May 11, 2004
    Messages:
    348
    Likes Received:
    3
    Trophy Points:
    18
    Location:
    Latvia
    cPanel Access Level:
    Root Administrator
    Marie, where do you set the "postcpbackup" option? And could you post your script that archives everything? :rolleyes:

    So far I've discovered that, by default, the cPanel backup script creates its temporary folder inside /home :eek: Hard to believe, really. I am considering forcing it to use our second disk for this purpose.
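    If anyone wants to check where theirs points, something like this should do it (assuming your cPanel version still keeps the legacy settings in /etc/cpbackup.conf - the key name may differ):
    Code:
    # Check where the legacy backup configuration points
    # (assuming /etc/cpbackup.conf on this cPanel version).
    grep -i '^BACKUPDIR' /etc/cpbackup.conf
    # To use the second disk, the line would become e.g.:
    #   BACKUPDIR /backup2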

    Anton.
     
  4. yapluka

    yapluka Well-Known Member

    Joined:
    Dec 24, 2003
    Messages:
    301
    Likes Received:
    1
    Trophy Points:
    18
    Location:
    France
    cPanel Access Level:
    Root Administrator
    I'm using Rampage's FTP Backup script (http://www.webhostgear.com/index.php?art/id:174), to which I added a compression part. I also added a part that splits files bigger than 2 GB.

    To run this script right after the WHM backup is complete, create the file /scripts/postcpbackup if it doesn't already exist, chmod it 700, and paste in the following:

    Code:
    #!/bin/bash
    
    #
    ##
    ### FTP Backup by WebHostGear.com
    ### Author: Steven Leggett AKA Ramprage
    ### Copyright 2004 WebHostGear.com and Steven Leggett      
    ### Contact? info@webhostgear.com
    ### 
    ##
    #
    # FTP Backup script created by Ramprage of WebhostGear.com
    # Copies all files from cPanel's daily backups to a remote FTP server
    #
    # WARNING: 
    # This script is the intellectual property of Steven Leggett and may not be 
    # copied, redistributed or sold without explicit written permission from the author. 
    # We provide NO WARRANTY for this script. Use at your own risk
    
    
    version=1.0
    
    ##### INSTALL INSTRUCTIONS: STEP 1 #####
    ##### START ENTER YOUR INFO HERE #####
    
    serverip=FTPIP
    # Your remote server's IP address
    # EG: serverip=192.168.1.1
    
    serveruser=FTPUSERNAME
    # The FTP login for the remote server
    # EG: serveruser=bob
    
    serverpass=FTPPASSWD
    # The FTP password for the remote server
    # EG: serverpass=mypassword
    
    localdir=/backup/cpbackup/daily
    # WHERE LOCAL FILES ARE TO BACKUP
    # NO TRAILING SLASH
    # EG: localdir=/backup/cpbackup/daily
    
    
    remotedir=/
    # FTP directory where you want to save files to
    # This directory must exist on the FTP server!
    # NO TRAILING SLASH
    # EG: remotedir=/serverdirectory
    
    
    ##### END YOUR INFO HERE #####
    
    
    ##### INSTALL INSTRUCTIONS: STEP 2 #####
    # CHMOD the script to 755: # chmod 755 ftpbackup.sh
    
    # Add the script to a scheduled cron job to run as often as you like
    
    # In SSH do crontab -e, then paste in the following
    # 0 3 * * 1 /root/ftpbackup.sh
    # This runs the FTP backup at 03:00 every Monday (day-of-week 1);
    # see `man 5 crontab` for more info on setting dates and times.
    
    
    ##### INSTALL COMPLETE #####
    # DO NOT MODIFY ANYTHING BELOW #
    
    
    bakdate=`date`
    host=`hostname`
    cd "$localdir" || exit 1
    
    echo "Starting FTP backup on $host, $bakdate"
    
    # Check file sizes and split big archives
    sizemax="1500000000"
    for user in *
    do
        # Skip anything that is not an account folder (e.g. leftover archives)
        [ -d "$user" ] || continue
    
        # Archive the account's daily backup folder
        tar cfz "$user.tar.gz" "$user"
        filesize=`ls -al "$user.tar.gz" | awk '{print $5}'`
    
        if [ "$filesize" -gt "$sizemax" ]; then
            # Archive is too big for the FTP server: split it into 1400 MB chunks
            split -b 1400m "$user.tar.gz" "$user.tar.gz".
    
    # FTP the chunks to the backup directory on the backup server
    /usr/bin/ftp -in <<END
    open $serverip
    user $serveruser $serverpass
    cd $remotedir
    bin
    verbose
    mput $user.tar.gz.*
    stat
    bye
    END
            rm -f "$user".tar.gz.*
    
        else
    
    # FTP the file to the backup directory on the backup server
    /usr/bin/ftp -in <<END
    open $serverip
    user $serveruser $serverpass
    cd $remotedir
    bin
    verbose
    put $user.tar.gz
    stat
    bye
    END
        fi
    
    done
    
    # Clean up the local archives once everything has been uploaded
    rm -f "$localdir"/*.tar.gz
    echo "FTP backup complete on $bakdate"
    exit 0
    
    In this case I just delete the archive files, but you could also keep them in another folder instead.
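    Two side notes: if an archive was split, you can stitch the chunks back together on the backup server before restoring; and if you'd rather keep local copies than delete them, replace the final rm with a move into a dated folder. Roughly (the /backup/archive path is just an example):
    Code:
    # Reassemble a split archive on the backup server before restoring:
    cat user.tar.gz.* > user.tar.gz
    
    # Alternative to the final rm: keep local copies in a dated folder
    # (/backup/archive is a made-up example path).
    today=`date +%Y-%m-%d`
    mkdir -p "/backup/archive/$today"
    mv "$localdir"/*.tar.gz "/backup/archive/$today/"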
     
    anton_latvia likes this.