Working custom backup script to Amazon S3?

Discussion in 'Data Protection' started by mattin, Jun 13, 2013.

  1. mattin

    mattin Member

    Hello,

    Is there anybody here who has a working custom backup script for the new backup system to Amazon S3 and is willing to share it? :)

    Thanks ;)
     
  2. cPanelMichael

    cPanelMichael Forums Analyst
    Staff Member

    Hello :)

    In case another user is not able to provide you with this specific example, you can find documentation on using a custom backup destination at:

    Custom Destinations for Backup Configuration

    An example custom destination can be found in:

    /usr/local/cpanel/scripts/custom_backup_destination.pl.skeleton
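
    For a rough idea of the shape such a script takes (a minimal sketch, not the skeleton itself): cPanel invokes the custom destination script with a command name, such as put or get, followed by that command's arguments. The bucket name and the s3put call from the timkay aws tool are illustrative assumptions; check the skeleton for the exact command set and argument order.

    Code:
    #!/bin/bash
    # Minimal sketch of a custom destination script -- illustrative only.
    # Assumes cPanel passes a command name plus arguments; see the
    # skeleton above for the authoritative interface.
    cmd=$1
    shift
    case "$cmd" in
        put)
            # put <local_file> <remote_file>: upload the archive to S3
            # (s3put from http://timkay.com/aws/; bucket name is a placeholder)
            /usr/bin/s3put "your_bucket_name/$2" "$1"
            ;;
        *)
            # The remaining commands (get, ls, mkdir, chdir, rmdir,
            # delete) would be handled here in a real script.
            exit 1
            ;;
    esac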

    Thank you.
     
  3. rpereyra

    rpereyra Member

    Any working script somewhere?

    roberto
     
  4. Evolve

    Evolve Well-Known Member

    You're just a feature tease, cPanel!

    Basically you're saying it's possible but you're not going to do the work for us?
     
  5. stevo81989

    stevo81989 Registered

    Lol, sounds like it. I'm actually trying to come up with this solution on my own, but I admit that my Perl is not very strong. Anyone know if we can use Python as well?
     
  6. cPanelMichael

    cPanelMichael Forums Analyst
    Staff Member

    There is an existing feature request for native support of Amazon S3 backups at:

    Amazon S3 integrated into WHM as a backup option

    I encourage you to vote for, and add your input to, this feature request if you would like to see native support for this feature implemented.

    Thank you.
     
  7. briansol

    briansol Active Member

    This is hugely important, and there's no good alternative even with paid add-on scripts.

    Please help us back up off-site with S3!
     
  8. tgv

    tgv Member

    In case someone decides to write their own custom destination script, cPanel's documentation is missing one thing: the uploading function must also periodically write to STDOUT, otherwise the script is killed as timed out. This only matters for large archives that take longer than the maximum allowable 300 seconds to copy.
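
    A simple way to satisfy that requirement, sketched here under the assumption that the upload is one long-running command (the s3put call and bucket name are illustrative): run the transfer in the background and print a heartbeat until it finishes.

    Code:
    # Heartbeat: write to STDOUT every 60 seconds while the upload runs,
    # so the script is not killed at the 300-second timeout.
    /usr/bin/s3put "your_bucket_name/$remote_file" "$local_file" &
    upload_pid=$!
    while kill -0 "$upload_pid" 2>/dev/null; do
        echo "still uploading..."
        sleep 60
    done
    wait "$upload_pid"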
     
  11. 3x-a-D3-u5

    3x-a-D3-u5 Member

    I wrote this a while back and have been using it for over a year now, but I'm going to rewrite it to take advantage of the new backup system. Currently it uses the legacy cPanel system. It should do the trick:

    You need this Perl library installed: aws - simple access to Amazon EC2 and S3

    Follow those instructions carefully. Below is the script. There are plenty of comments and the code isn't too difficult to navigate. I set it up so I have 4 backups: 1 daily and 3 rolling copies (one 10 days old, one 20 days old, and one 30 days old). Actually my comment makes more sense:

    31 days of rolling log files are kept. When the counter reaches 1, 11 and 21, a backup is made.

    Code:
    #!/bin/bash
    #####################################################
    #	RWS ROLLING AWS BACKUPs by Paul Kresfelder (relies on http://timkay.com/aws/)
    #	Version 1.2 (September 2012)
    #
    #	Back up daily cPanel backups, plus a rolling 10-day backup with 3 versions.
    #	Transfer premade cPanel backups to Amazon S3.
    #	Version 1.2 is less bandwidth intensive, as it copies backups to new files within S3 instead of from the server to S3.
    #	It also adds better logging and more detailed terminal running stats.
    #####################################################
    #   This program is distributed in the hope that it will be useful,
    #   but WITHOUT ANY WARRANTY; without even the implied warranty of
    #   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    #   GNU General Public License for more details.
    #
    #   You should have received a copy of the GNU General Public License
    #   along with this program.  If not, see <http://www.gnu.org/licenses/>.
    #####################################################
    
    
    #shell function: upload the local daily cPanel backup for $user to S3
    do_recent_backup()
    {
        bkpname=$1
        /usr/bin/s3put "$bucket"/"$user"-"$bkpname".tar.gz /backup/cpbackup/daily/"$user".tar.gz >> $logfile 2>&1
        echo -e `date '+%D %T'` "Copied "$bkpname" backup to "$bucket"/"$user"-"$bkpname".tar.gz" >> $logfile
        echo -e `date '+%D %T'` "Copied "$bkpname" backup to "$bucket"/"$user"-"$bkpname".tar.gz"
    }
    #usage
    #do_recent_backup $dailyname
    
    #shell function: copy the daily backup within S3 to a rolling segment copy
    do_semester_backup()
    {
        segmentbkpname=$1

        #s3cp copies source to destination: daily backup -> segment copy
        /usr/bin/s3cp --progress /"$bucket"/"$user"-"$dailyname".tar.gz /"$bucket"/"$user"-"$segmentbkpname".tar.gz >> $logfile 2>&1
        echo -e `date '+%D %T'` "Copied "$dailyname" backup to "$bucket"/"$user"-"$segmentbkpname".tar.gz" >> $logfile
        echo -e `date '+%D %T'` "Copied "$dailyname" backup to "$bucket"/"$user"-"$segmentbkpname".tar.gz"
    }
    #usage
    #do_semester_backup segment1
    
    echo "RWS ROLLING AWS BACKUPs by Paul Kresfelder"
    echo "Version 1.2 (September 2012)"
    echo "Backup daily cpanel backups, and a rolling 10 backup with 3 versions. Transfer premade daily cpanel backups to Amazon S3"
    echo
    echo
    echo
    echo
    echo
    
    (. /home/tools/counterfile    #counterfile must exist and contain a line like: day=0
    day=$(( $day + 1 ))
    if [ $day -eq 31 ]
    then day=0
    fi
    echo "day=$day" > /home/tools/counterfile
    . /home/tools/counterfile
    
    ################################################
    #3 Rolling backups made with 10 day intervals
    # Log files for all actions - separate log files for each day, kept for 30 days
    #Define $logfile, $bucket
    bucket="your_bucket_name"
    logfile=/home/tools/logs/rwsbackup_day"$day".log
    #31 days of rolling log files kept. When the counter reaches 1, 11 and 21 a backup is made.
    ################################################
    
    
    (if [ -e "$logfile" ];
    then
      rm -f "$logfile"
    fi)
    
    
    
    echo "You may follow full logfile using tail -f " $logfile
    echo "Current run Cycle: " $day
    echo
    echo "Current run Cycle: " $day >> $logfile
    echo "Bucket to be used: " $bucket
    echo
    echo "Bucket to be used: " $bucket >> $logfile
    
    ####Used to determine total users, and current user number.
    #Note: incrementing a counter inside a piped while loop happens in a
    #subshell and is lost afterward, so count the user files with wc -l.
    usercount=$(/usr/bin/find /var/cpanel/users -type f | wc -l)
    currentuser=0
    echo $usercount
    ####
    
    /usr/bin/find /var/cpanel/users -type f -printf "%f\n" |
    while read user; do
    let "currentuser += 1"
    echo `date '+%D %T'` "Current user is: "$user " ("$currentuser" of " $usercount ")" >> $logfile
    echo `date '+%D %T'` "Current user is: "$user " ("$currentuser" of " $usercount ")"
    
    #Send Daily Backup to S3
    #If you want to call it something other than "daily", change this only if you haven't run the script before; otherwise old files will not be overwritten.
    dailyname="daily"
    do_recent_backup $dailyname
    
    #Do backups rolling 10 day intervals
    (if [ $day -eq 1 ]; then
        do_semester_backup segment1
    elif [ $day -eq 11 ]; then
        do_semester_backup segment3
    elif [ $day -eq 21 ]; then
        do_semester_backup segment2
    fi)
    done
    #Email the log.  Replace nobody@nowhere.com with your email address.
    #emailaddy="nobody@nowhere.com"
    #subjectofemail="S3 backup transfer complete"
    #themessage="/home/tools/logs/rwsbackup_day"$day".log"
    #mail -s "$subjectofemail" "$emailaddy" < "$themessage"
    )
    
    Tips:
    chmod the file 755
    Execute Pre/Post Backup Script: add this script's path to /scripts/postcpbackup so it runs after each cPanel backup (see the sketch below)
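
    For example, assuming the script above is saved as /home/tools/rwsbackup.sh (an illustrative path), the setup might look like this:

    Code:
    # Create the counter file and log directory the script expects
    mkdir -p /home/tools/logs
    echo "day=0" > /home/tools/counterfile

    # Make the backup script executable
    chmod 755 /home/tools/rwsbackup.sh

    # /scripts/postcpbackup runs after the legacy cpbackup finishes;
    # have it call the backup script
    printf '#!/bin/bash\n/home/tools/rwsbackup.sh\n' > /scripts/postcpbackup
    chmod 755 /scripts/postcpbackup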

    I hope it works for you. No responsibility taken for what you do to your own server.
     
    #11 3x-a-D3-u5, Oct 28, 2013
    Last edited: Oct 28, 2013
  12. 3x-a-D3-u5

    3x-a-D3-u5 Member

    Oh yes, a cool feature I found for cleaning up old forgotten accounts is lifecycle rules. Mine deletes backups older than 60 days, so I'm never left with accounts that were terminated from the server but are still costing me backup money in S3 buckets.

    In the Properties section of the bucket in the Amazon Management Console: "Lifecycle Rule - Create a lifecycle rule to schedule the archival of objects to Glacier and/or permanent removal of objects."
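
    The same cleanup can also be configured from the command line. A minimal sketch, assuming the modern AWS CLI is installed and configured (the bucket name and rule ID are placeholders):

    Code:
    # Permanently delete every object in the bucket 60 days after creation
    aws s3api put-bucket-lifecycle-configuration \
      --bucket your_bucket_name \
      --lifecycle-configuration '{
        "Rules": [{
          "ID": "delete-old-backups",
          "Status": "Enabled",
          "Filter": {"Prefix": ""},
          "Expiration": {"Days": 60}
        }]
      }'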
     
  13. Infopro

    Infopro cPanel Sr. Product Evangelist
    Staff Member

    I'm thinking we might move your posts out from here and into the Workarounds forum. Thoughts?
    Workarounds
     