Working custom backup script to Amazon S3?

cPanelMichael

Administrator
Staff member
Apr 11, 2011
47,880
2,258
463
Hello :)

In case another user is not able to provide you with this specific example, you can find documentation on using a custom backup destination at:

Custom Destinations for Backup Configuration

An example custom destination can be found in:

/usr/local/cpanel/scripts/custom_backup_destination.pl.skeleton
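
For orientation only: the skeleton handles a small set of transport commands (put, get, ls, mkdir, chdir, rmdir, delete) and dispatches on the command name. A very rough Bash outline of that idea follows -- it is not the skeleton itself, so copy the exact calling convention and parameters from the skeleton file above.

Code:
#!/bin/bash
# Rough outline only; the real parameter list for each command is documented
# in /usr/local/cpanel/scripts/custom_backup_destination.pl.skeleton.
cmd=$1
shift

case "$cmd" in
    put)    echo "upload $* to your destination here (e.g. with an S3 client)" ;;
    get)    echo "download $* from your destination here" ;;
    ls)     echo "list the remote path here" ;;
    mkdir|chdir|rmdir|delete)
            echo "handle $cmd here (S3 has no real directories, so these may be no-ops or key-prefix operations)" ;;
    *)      echo "unknown command: $cmd" >&2; exit 1 ;;
esac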

Thank you.
 

Evolve

Well-Known Member
Jan 31, 2007
47
0
156
Why wasn’t Amazon S3 support added by default?
We added support for things that we were able to fit into our development time constraints. We have documentation on how to add support for things like S3 in the meantime here: Custom Destinations for Backup Configuration.
You're just a feature tease, cPanel!

Basically you're saying it's possible but you're not going to do the work for us?
 

stevo81989

Registered
Sep 7, 2012
3
0
1
cPanel Access Level
DataCenter Provider
Lol, sounds like it. I'm actually trying to come up with this solution on my own, but I admit that my Perl is not very strong. Does anyone know if we can use Python as well?
 

cPanelMichael

Administrator
Staff member
Apr 11, 2011
47,880
2,258
463
There is an existing feature request for native support of Amazon S3 backups at:

Amazon S3 integrated into WHM as a backup option

I encourage you to vote for, and add your input to, this feature request if you would like to see native support for this feature implemented.

Thank you.
 

briansol

Well-Known Member
Oct 31, 2007
46
2
58
ct
This is hugely important, and there's no good alternative even with paid add-on scripts.

Please help us back up off-site with S3!
 

tgv

Member
May 29, 2008
9
0
51
In case someone decides to write their own custom destination script, cPanel's documentation is missing one thing -- the upload function must also periodically write to STDOUT, otherwise it gets shut down as timed out. This only matters when handling large archives that take longer than the maximum allowable 300 seconds to copy.
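
For example, one way to satisfy that is to run the actual transfer in the background and print a heartbeat until it finishes. This is a sketch only; the upload command and variables are placeholders.

Code:
#!/bin/bash
# Sketch: emit output periodically so the backup system doesn't kill a long
# transfer as timed out. "your_upload_command" is a placeholder.
local_file=$1    # archive to upload (placeholder)
remote_path=$2   # destination (placeholder)

your_upload_command "$local_file" "$remote_path" &
upload_pid=$!

# Heartbeat: print something well inside the timeout window until the upload exits
while kill -0 "$upload_pid" 2>/dev/null; do
    echo "upload of $local_file still in progress..."
    sleep 60
done

wait "$upload_pid"
exit $?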
 

3x-a-D3-u5

Member
Nov 26, 2006
11
0
151
I wrote this a while back and I've been using it for over a year now, but I'm going to rewrite it to take advantage of the new backup system. Currently it uses the legacy cPanel system. It should do the trick:

You need this Perl tool installed: aws - simple access to Amazon EC2 and S3

To use aws, follow these steps:

Install curl if necessary (either apt-get install curl or yum install curl)
Download aws to your computer

curl https://raw.github.com/timkay/aws/master/aws -o aws

OPTIONAL. Perform this step if you want to use commands like s3ls and ec2run. If you prefer commands like aws ls or aws run, then you may skip this step. Just make sure to set "aws" to be executable: "chmod +x aws".

Perform the optional install with:

perl aws --install

(This step sets aws to be executable, copies it to /usr/bin if you are root, and then symlinks the aliases, the same as aws --link does.)

Put your AWS credentials in ~/.awssecret: the Access Key ID on the first line and the Secret Access Key on the second line. Example:

1B5JYHPQCXW13GWKHAG2
2GAHKWG3+1wxcqyhpj5b1Ggqc0TIxj21DKkidjfz
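
Once the credentials are in place, it's worth locking the file down and doing a quick test (assuming the optional install created the s3ls alias; otherwise use ./aws ls):

Code:
chmod 600 ~/.awssecret   # keep the keys readable only by you
s3ls                     # should list your buckets if the credentials work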
Follow those instructions carefully. Below is the script. There are plenty of comments and the code isn't too difficult to navigate. I set it up so I have 4 backups: 1 daily and 3 rolling ones (one 10 days old, one 20 days old and one 30 days old). Actually my comment makes more sense:

31 days of rolling log files kept. When the counter reaches 1, 11 and 21 a backup is made.

Code:
#!/bin/bash
#####################################################
#	RWS ROLLING AWS BACKUPs by Paul Kresfelder (relies on http://timkay.com/aws/)
#	Version 1.2 (September 2012)
#
#	Back up daily cPanel backups, plus 3 rolling backups at 10-day intervals.
#	Transfer premade cPanel backups to Amazon S3
#	Version 1.2 is less bandwidth intensive as it copies backups within S3 to new files, instead of from the server to S3
#	Also offers better logging and more detailed terminal run stats.
#####################################################
#   This program is distributed in the hope that it will be useful,
#   but WITHOUT ANY WARRANTY; without even the implied warranty of
#   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#   GNU General Public License for more details.
#
#   You should have received a copy of the GNU General Public License
#   along with this program; if not, see <http://www.gnu.org/licenses/>.
#####################################################


#shell function: upload today's cPanel backup for $user to S3
do_recent_backup()
{
	bkpname=$1
	/usr/bin/s3put "$bucket"/"$user"-"$bkpname".tar.gz /backup/cpbackup/daily/"$user".tar.gz >> $logfile 2>&1
	echo -e `date '+%D %T'` "Copied "$bkpname" backup to "$bucket"/"$user"-"$bkpname".tar.gz" >> $logfile
	echo -e `date '+%D %T'` "Copied "$bkpname" backup to "$bucket"/"$user"-"$bkpname".tar.gz"
}
#usage
#do_recent_backup $dailyname

do_semester_backup()
{
	segmentbkpname=$1

	#Copy the existing daily backup to the rolling segment name within S3 (no re-upload from the server)
	/usr/bin/s3cp --progress /"$bucket"/"$user"-"$segmentbkpname".tar.gz /"$bucket"/"$user"-"$dailyname".tar.gz >> $logfile 2>&1
	echo -e `date '+%D %T'` "Copied "$segmentbkpname" backup to "$bucket"/"$user"-"$segmentbkpname".tar.gz" >> $logfile
	echo -e `date '+%D %T'` "Copied "$segmentbkpname" backup to "$bucket"/"$user"-"$segmentbkpname".tar.gz"
}
#usage
#do_semester_backup segment1

echo "RWS ROLLING AWS BACKUPs by Paul Kresfelder"
echo "Version 1.2 (September 2012)"
echo "Backup daily cpanel backups, and a rolling 10 backup with 3 versions. Transfer premade daily cpanel backups to Amazon S3"
echo
echo
echo
echo
echo

#The whole run happens in one subshell; first advance the day counter (wraps back to 0 after day 30)
(. /home/tools/counterfile
day=$(( $day + 1 ))
if [ $day -eq 31 ]
then day=0
fi
echo "day=$day" > /home/tools/counterfile
. /home/tools/counterfile

################################################
#3 rolling backups made at 10-day intervals
# Log files for all actions - separate log files for each day, kept for 30 days
#Define $logfile, $bucket
bucket="your_bucket_name"
logfile=/home/tools/logs/rwsbackup_day"$day".log
#31 days of rolling log files kept. When the counter reaches 1, 11 and 21 a backup is made.
################################################


#Start a fresh log for this day number
if [ -e "$logfile" ]; then
  rm -f "$logfile"
fi



echo "You may follow full logfile using tail -f " $logfile
echo "Current run Cycle: " $day
echo
echo "Current run Cycle: " $day >> $logfile
echo "Bucket to be used: " $bucket
echo
echo "Bucket to be used: " $bucket >> $logfile

####Used to determine total users, and current user number.
#Count with wc -l; incrementing inside a piped while loop happens in a subshell, so the value would be lost
usercount=$(/usr/bin/find /var/cpanel/users -type f | wc -l)
currentuser=0
echo $usercount
####

/usr/bin/find /var/cpanel/users -type f -printf "%f\n" |
while read user; do
let "currentuser += 1"
echo `date '+%D %T'` "Current user is: "$user " ("$currentuser" of " $usercount ")" >> $logfile
echo `date '+%D %T'` "Current user is: "$user " ("$currentuser" of " $usercount ")"

#Send Daily Backup to S3
#If you want to call it something other than daily: only change this if you haven't run the script before, otherwise old files will not be overwritten.
dailyname="daily"
do_recent_backup $dailyname

#Do the rolling backups at 10-day intervals
if [ $day -eq 1 ]; then
	do_semester_backup segment1
elif [ $day -eq 11 ]; then
	do_semester_backup segment3
elif [ $day -eq 21 ]; then
	do_semester_backup segment2
fi
done
#Email the log.  Replace [email protected] with your email address.
#emailaddy="[email protected]"
#subjectofemail="S3 backup transfer complete"
#themessage="/home/tools/logs/rwsbackup_day"$day".log"
#mail -s "$subjectofemail" "$emailaddy" < "$themessage"
)
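Before the first run, the paths the script assumes must exist -- the counter file it sources and the log directory it writes to. Something like:

Code:
mkdir -p /home/tools/logs
echo "day=0" > /home/tools/counterfile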
Tips:
chmod the file 755
Execute Pre/Post Backup Script /scripts/postcpbackup (just add this script's file path to that file -- see the sketch below)
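
For the second tip, a minimal /scripts/postcpbackup wrapper might look like this (the location of the backup script is an assumption -- use wherever you saved it):

Code:
#!/bin/bash
# cPanel's legacy backup system runs /scripts/postcpbackup after backups finish.
# /home/tools/rwsbackup.sh is a hypothetical path to the script above.
/home/tools/rwsbackup.sh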

I hope it works for you. I take no responsibility for what you do to your own server.
 
Last edited:

3x-a-D3-u5

Member
Nov 26, 2006
11
0
151
Oh yes, a cool feature I found for cleaning up old forgotten accounts is lifecycle rules. Mine deletes backups older than 60 days, so I'm never left with accounts that were terminated from the server but are still costing me backup money in S3 buckets.

In the Properties section of the bucket in the Amazon management console: "Lifecycle Rule - Create a lifecycle rule to schedule the archival of objects to Glacier and/or permanent removal of objects."
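
If you prefer to script it rather than click through the console, the official AWS CLI (a separate tool from the timkay aws script used above) can set the same kind of rule. A sketch, assuming the CLI is installed and configured:

Code:
# Expire objects older than 60 days in the backup bucket
aws s3api put-bucket-lifecycle-configuration \
  --bucket your_bucket_name \
  --lifecycle-configuration '{
    "Rules": [
      {
        "ID": "expire-old-backups",
        "Filter": { "Prefix": "" },
        "Status": "Enabled",
        "Expiration": { "Days": 60 }
      }
    ]
  }'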
 

Infopro

Well-Known Member
May 20, 2003
17,075
524
613
Pennsylvania
cPanel Access Level
Root Administrator
Twitter
I'm thinking we might move your posts out from here and into the Workarounds forum, thoughts?
Workarounds