Auto-cleanup for Amazon S3 buckets:
A cool feature I found for cleaning up old, forgotten accounts is lifecycle rules. Mine deletes backups older than 60 days, so I'm never left with accounts that were terminated on the server but are still costing me backup money in S3.
In the Properties section of the bucket in the Amazon Management Console: "Lifecycle Rule - Create a lifecycle rule to schedule the archival of objects to Glacier and/or permanent removal of objects."
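If you'd rather script the rule than click through the console, Amazon's own aws CLI (a different tool from the timkay "aws" perl script used below) can set it. This is just a minimal sketch, assuming a bucket named your_bucket_name and the same 60-day expiry:
Code:
# lifecycle.json - expire every object in the bucket after 60 days
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "expire-old-backups",
      "Filter": { "Prefix": "" },
      "Status": "Enabled",
      "Expiration": { "Days": 60 }
    }
  ]
}
EOF
aws s3api put-bucket-lifecycle-configuration \
  --bucket your_bucket_name --lifecycle-configuration file://lifecycle.json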
Legacy cPanel backups copied to Amazon S3
I wrote this script a while back and have been using it for over a year now, but I'm going to rewrite it to take advantage of the new backup system. Currently it uses the legacy cPanel backup system. It should do the trick:
Follow these instructions carefully. Below is the script; there are plenty of comments and the code isn't too difficult to navigate. I set it up so I have 4 backups: 1 daily plus 3 rolling copies (one 10 days old, one 20 days old and one 30 days old).
You need this perl library installed: aws - simple access to Amazon EC2 and S3
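For reference, once a full 31-day cycle has run, each account ends up with four objects in the bucket along these lines (bucket and account names here are placeholders):
Code:
your_bucket_name/username-daily.tar.gz    - overwritten every run
your_bucket_name/username-segment1.tar.gz - refreshed when the day counter hits 1
your_bucket_name/username-segment3.tar.gz - refreshed when the day counter hits 11
your_bucket_name/username-segment2.tar.gz - refreshed when the day counter hits 21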
To use aws, follow these steps:
Install curl if necessary (either apt-get install curl or yum install curl)
Download aws to your computer
curl https://raw.github.com/timkay/aws/master/aws -o aws
OPTIONAL. Perform this step if you want to use commands like s3ls and ec2run. If you prefer commands like aws ls or aws run, then you may skip this step. Just make sure to set "aws" to be executable: "chmod +x aws".
Perform the optional install with:
perl aws --install
(This step sets aws to be executable, copies it to /usr/bin if you are root, and then symlinks the aliases, the same as aws --link does.)
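For example, after the optional install these two commands are equivalent:
aws ls your_bucket_name
s3ls your_bucket_name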
Put your AWS credentials in ~/.awssecret: the Access Key ID on the first line and the Secret Access Key on the second line. Example:
1B5JYHPQCXW13GWKHAG2
2GAHKWG3+1wxcqyhpj5b1Ggqc0TIxj21DKkidjfz
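Since ~/.awssecret holds your keys in plain text, it's a good idea to make it readable only by you:
chmod 600 ~/.awssecret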
31 days of rolling log files are kept. When the day counter reaches 1, 11 or 21, a rolling backup is made.
Install Tips:
- chmod the file 755
- Execute it from the post-backup hook /scripts/postcpbackup (just add the path to this file there; see the sketch below)
- The script is referenced as rwsbackup.1.2.sh, so use that name when you create the file
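Putting those tips together, the one-time setup might look like this - a sketch assuming you keep the script in /home/tools, which is the same directory the script itself uses for its counter file and logs:
Code:
# create the counter file and log directory the script expects
mkdir -p /home/tools/logs
echo "day=0" > /home/tools/counterfile
# make the script executable
chmod 755 /home/tools/rwsbackup.1.2.sh
# hook it into cPanel: append this line to /scripts/postcpbackup
echo "/home/tools/rwsbackup.1.2.sh" >> /scripts/postcpbackup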
I hope it works for you. I take no responsibility for what you do to your own server.
Code:
#!/bin/bash
#####################################################
# RWS ROLLING AWS BACKUPs by 3x-a-D3-u5 (relies on http://timkay.com/aws/)
# Version 1.2 (September 2012)
#
# Backs up daily cPanel backups, plus a rolling 10-day backup kept in 3 versions.
# Transfers cPanel's premade backups to Amazon S3.
# Version 1.2 is less bandwidth intensive: it copies backups within S3 to new files
# instead of re-uploading them from the server.
# Also has better logging and more detailed terminal running stats.
#####################################################
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#####################################################
#shell function
do_recent_backup()
{
bkpname=$1
/usr/bin/s3put "$bucket"/"$user"-"$bkpname".tar.gz /backup/cpbackup/daily/"$user".tar.gz >> $logfile 2>&1
echo -e `date '+%D %T'` "Copied "$bkpname" backup to "$bucket"/"$user"-"$bkpname".tar.gz" >> $logfile
echo -e `date '+%D %T'` "Copied "$bkpname" backup to "$bucket"/"$user"-"$bkpname".tar.gz"
}
#usage
#do_recent_backup $dailyname
do_semester_backup()
{
segmentbkpname=$1
/usr/bin/s3cp --progress /"$bucket"/"$user"-"$segmentbkpname".tar.gz /"$bucket"/"$user"-"$dailyname".tar.gz >> $logfile 2>&1
echo -e `date '+%D %T'` "Copied "$segmentbkpname" backup to "$bucket"/"$user"-"$segmentbkpname".tar.gz" >> $logfile
echo -e `date '+%D %T'` "Copied "$segmentbkpname" backup to "$bucket"/"$user"-"$segmentbkpname".tar.gz"
}
#usage (takes one argument, the segment name; uses the global $dailyname)
#do_semester_backup segment1
#do_semester_backup segment2
echo "RWS ROLLING AWS BACKUPs by 3x-a-D3-u5"
echo "Version 1.2 (September 2012)"
echo "Backup daily cpanel backups, and a rolling 10 backup with 3 versions. Transfer premade daily cpanel backups to Amazon S3"
echo
echo
echo
echo
echo
(. /home/tools/counterfile
day=$(( $day + 1 ))
if [ $day -eq 31 ]
then day=0
fi
echo "day=$day" > /home/tools/counterfile
. /home/tools/counterfile
################################################
#3 Rolling backups made with 10 day intervals
# Log files for all actions - separate log files for each day, kept for 31 days
#Define $logfile, $bucket
bucket="your_bucket_name"
logfile=/home/tools/logs/rwsbackup_day"$day".log
#31 days of rolling log files kept. When the counter reaches 1, 11 and 21 a backup is made.
################################################
#start this day's log fresh (rm -f is a no-op if the file doesn't exist)
rm -f "$logfile"
echo "You may follow full logfile using tail -f " $logfile
echo "Current run Cycle: " $day
echo
echo "Current run Cycle: " $day >> $logfile
echo "Bucket to be used: " $bucket
echo
echo "Bucket to be used: " $bucket >> $logfile
####Used to determine total users, and current user number.
#Counting inside a "find | while read" pipeline runs in a subshell,
#so the incremented total would be lost; wc -l avoids that.
usercount=$(/usr/bin/find /var/cpanel/users -type f | wc -l)
currentuser=0
echo "Total users to back up: $usercount"
####
/usr/bin/find /var/cpanel/users -type f -printf "%f\n" |
while read user; do
let "currentuser += 1"
echo `date '+%D %T'` "Current user is: "$user " ("$currentuser" of " $usercount ")" >> $logfile
echo `date '+%D %T'` "Current user is: "$user " ("$currentuser" of " $usercount ")"
#Send Daily Backup to S3
#If you want to call it something other than "daily", change this - but only if you haven't run the script before, otherwise old files will not be overwritten.
dailyname="daily"
do_recent_backup $dailyname
#Do backups rolling 10 day intervals
(if [ $day -eq 1 ]; then
do_semester_backup segment1
elif [ $day -eq 11 ]; then
do_semester_backup segment3
elif [ $day -eq 21 ]; then
do_semester_backup segment2
fi)
done
#Email the log. Replace you@example.com with your email address and uncomment.
#Note: bash assignments take no leading $, and mail reads the body from stdin.
#emailaddy="you@example.com"
#subjectofemail="S3 backup transfer complete"
#mail -s "$subjectofemail" "$emailaddy" < "$logfile"
)
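Once a run finishes, you can sanity-check the results from the shell (s3ls and s3get are among the aliases that aws --install creates); the account name here is just an example:
Code:
# list what landed in the bucket
s3ls your_bucket_name
# pull a backup down again if you ever need to restore an account
s3get your_bucket_name/username-daily.tar.gz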