The Community Forums


Legacy cPanel backups sent to amazon S3 with 'auto-cleanup'

Discussion in 'Workarounds and Optimization' started by 3x-a-D3-u5, Oct 29, 2013.

  1. 3x-a-D3-u5

    3x-a-D3-u5 Member

    Joined:
    Nov 26, 2006
    Messages:
    11
    Likes Received:
    0
    Trophy Points:
    1
    Auto-cleanup for Amazon S3 buckets:
    A cool feature I found for cleaning up old, forgotten accounts is lifecycle rules. Mine deletes backups older than 60 days, so I'm never left with accounts that were terminated from the server but are still costing me backup money in S3 buckets.

    In the Properties section of the bucket in the Amazon Management Console: "Lifecycle Rule - Create a lifecycle rule to schedule the archival of objects to Glacier and/or permanent removal of objects."
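
    If you'd rather script it, here is a minimal sketch of the same 60-day expiry rule using the AWS CLI (assuming the CLI is installed and configured; "your_bucket_name" and the rule ID are placeholders):

    Code:
    # Permanently delete every object in the bucket 60 days after creation.
    aws s3api put-bucket-lifecycle-configuration \
      --bucket your_bucket_name \
      --lifecycle-configuration '{
        "Rules": [{
          "ID": "expire-old-backups",
          "Status": "Enabled",
          "Filter": { "Prefix": "" },
          "Expiration": { "Days": 60 }
        }]
      }'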

    Legacy cPanel backups copied to Amazon S3
    I wrote this script a while back and have been using it for over a year now, but I'm going to rewrite it to take advantage of the new backup system. Currently it uses the legacy cPanel backup system. It should do the trick:

    Follow these instructions carefully. Below is the script. There are plenty of comments and the code isn't too difficult to navigate. I set it up so I have 4 backups: 1 daily and 3 rolling (one 10 days old, one 20 days old and one 30 days old). Actually, my comment in the script says it better:

    31 days of rolling log files kept. When the counter reaches 1, 11 and 21 a backup is made.

    Install Tips:
    • chmod the file 755
    • Execute it from the Pre/Post Backup Script hook /scripts/postcpbackup (just add the path to this file at the end of that file)
    • This file is referenced as rwsbackup.1.2.sh, so name it that when you create the file (see the setup sketch below)
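
    A minimal one-time setup sketch, assuming you keep the script at /home/tools/rwsbackup.1.2.sh (the /home/tools location is an assumption that matches the counter file and log directory the script uses; adjust to taste):

    Code:
    # Create the log directory and counter file the script expects.
    mkdir -p /home/tools/logs
    echo "day=0" > /home/tools/counterfile

    # Make the script executable.
    chmod 755 /home/tools/rwsbackup.1.2.sh

    # Run it after each legacy cPanel backup by adding it to the hook file.
    echo "/home/tools/rwsbackup.1.2.sh" >> /scripts/postcpbackup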

    I hope it works for you. No responsibility taken for what you do to your own server.

    Code:
    #!/bin/bash
    #####################################################
    #    RWS ROLLING AWS BACKUPs by 3x-a-D3-u5 (relies on http://timkay.com/aws/)
    #    Version 1.2 (September 2012)
    #
    #    Back up daily cPanel backups, plus a rolling 10-day backup in 3 versions.
    #    Transfers premade cPanel backups to Amazon S3.
    #    Version 1.2 is less bandwidth intensive, as it copies backups from within S3 to new files instead of from the server to S3.
    #    It also has better logging and more detailed terminal running stats.
    #####################################################
    #   This program is distributed in the hope that it will be useful,
    #   but WITHOUT ANY WARRANTY; without even the implied warranty of
    #   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    #   GNU General Public License for more details.
    #
    #   You should have received a copy of the GNU General Public License
    #   along with this program.  If not, see <http://www.gnu.org/licenses/>.
    #####################################################
    
    
    # Shell function: upload the user's premade daily backup from the server to S3
    do_recent_backup()
    {
        bkpname=$1
        /usr/bin/s3put "$bucket"/"$user"-"$bkpname".tar.gz /backup/cpbackup/daily/"$user".tar.gz >> $logfile 2>&1
        echo -e `date '+%D %T'` "Copied $bkpname backup to $bucket/$user-$bkpname.tar.gz" >> $logfile
        echo -e `date '+%D %T'` "Copied $bkpname backup to $bucket/$user-$bkpname.tar.gz"
    }
    #usage
    #do_recent_backup $dailyname
    
    # Shell function: copy today's daily backup to a rolling segment slot, within S3
    # (source is the fresh daily upload, destination is the segment file)
    do_semester_backup()
    {
        segmentbkpname=$1
        /usr/bin/s3cp --progress /"$bucket"/"$user"-"$dailyname".tar.gz /"$bucket"/"$user"-"$segmentbkpname".tar.gz >> $logfile 2>&1
        echo -e `date '+%D %T'` "Copied $dailyname backup to $bucket/$user-$segmentbkpname.tar.gz" >> $logfile
        echo -e `date '+%D %T'` "Copied $dailyname backup to $bucket/$user-$segmentbkpname.tar.gz"
    }
    #usage
    #do_semester_backup segment1
    
    echo "RWS ROLLING AWS BACKUPs by 3x-a-D3-u5"
    echo "Version 1.2 (September 2012)"
    echo "Backup daily cpanel backups, and a rolling 10 backup with 3 versions. Transfer premade daily cpanel backups to Amazon S3"
    echo
    echo
    echo
    echo
    echo
    
    # Read, bump, and persist the 0-30 cycle-day counter
    . /home/tools/counterfile
    day=$(( day + 1 ))
    if [ $day -eq 31 ]; then
        day=0
    fi
    echo "day=$day" > /home/tools/counterfile
    
    ################################################
    # 3 rolling backups made at 10-day intervals
    # Log files for all actions - a separate log file for each day, kept for 31 days
    # Define $logfile, $bucket
    bucket="your_bucket_name"
    logfile=/home/tools/logs/rwsbackup_day"$day".log
    # 31 days of rolling log files kept. When the counter reaches 1, 11 and 21 a backup is made.
    ################################################
    
    
    # Start this cycle day's log afresh
    rm -f "$logfile"
    
    
    
    echo "You may follow full logfile using tail -f " $logfile
    echo "Current run Cycle: " $day
    echo
    echo "Current run Cycle: " $day >> $logfile
    echo "Bucket to be used: " $bucket
    echo
    echo "Bucket to be used: " $bucket >> $logfile
    
    #### Count the users up front so progress can be shown as "n of total".
    usercount=$(/usr/bin/find /var/cpanel/users -type f | wc -l)
    currentuser=0
    echo "Total users: $usercount"
    ####
    
    /usr/bin/find /var/cpanel/users -type f -printf "%f\n" |
    while read user; do
        let "currentuser += 1"
        echo `date '+%D %T'` "Current user is: $user ($currentuser of $usercount)" >> $logfile
        echo `date '+%D %T'` "Current user is: $user ($currentuser of $usercount)"

        # Send the daily backup to S3.
        # If you want to call it something other than "daily", only change this if you
        # haven't run the script before; otherwise old files will not be overwritten.
        dailyname="daily"
        do_recent_backup $dailyname
    
        # Do rolling backups at 10-day intervals
        if [ $day -eq 1 ]; then
            do_semester_backup segment1
        elif [ $day -eq 11 ]; then
            do_semester_backup segment2
        elif [ $day -eq 21 ]; then
            do_semester_backup segment3
        fi
    done
    # Email the log. Uncomment and replace the address below with your own.
    #emailaddy="delta.rws@gmail.com"
    #subjectofemail="S3 backup transfer complete"
    #themessage="/home/tools/logs/rwsbackup_day$day.log"
    #mail -s "$subjectofemail" "$emailaddy" < "$themessage"
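
    To test a run by hand before the nightly backup triggers it (a sketch, assuming the paths from the setup above):

    Code:
    # Kick off a transfer manually and watch the log for the current cycle day.
    /home/tools/rwsbackup.1.2.sh
    tail -f /home/tools/logs/rwsbackup_day1.log   # use the cycle number printed at startup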
     
    #1 3x-a-D3-u5, Oct 29, 2013
    Last edited by a moderator: Apr 7, 2016
  2. cPanelMichael

    cPanelMichael Forums Analyst
    Staff Member

    Joined:
    Apr 11, 2011
    Messages:
    30,678
    Likes Received:
    654
    Trophy Points:
    113
    cPanel Access Level:
    Root Administrator
    Hello :)

    Thank you for sharing this workaround. Please note that user-submitted workarounds are not tested or supported by cPanel. We encourage everyone to review all aspects of workarounds before implementing them on a production server.

    Thank you.
     
  3. briansol

    briansol Active Member

    Joined:
    Oct 31, 2007
    Messages:
    35
    Likes Received:
    1
    Trophy Points:
    6
    Location:
    ct
    Does this other s3 lib conflict in any way with s3cmd?
     