Help with cron jobs please

Brook

Well-Known Member
Apr 22, 2005
I was wondering if anyone here has experience with cron jobs?

I want to set up a cron job to make daily back-ups of my database, turning my site off first.

This is how I envisage it to work:

1: rename '.htaccess' (in the public_html folder for the site) to .htaccess-open
2: rename '.htaccess-closed' to .htaccess
// this closes the site down so no-one can write/access the db

3: mysqldump --opt (DB_NAME) -u (DB_USERNAME) -p(DB_PASSWORD) > /path/to/dbbackup-$(date +%m%d%Y).sql
// this backs up the database

4: wait for 3 to finish
5: rename '.htaccess' to .htaccess-closed
6: rename '.htaccess-open' to .htaccess
// this opens the site back up

Is this easy enough to do? Anyone got any tips/pointers?

Thanks in advance!
 

Eric

Well-Known Member
Nov 25, 2007
Texas
cPanel Access Level
Root Administrator
Howdy,

It sounds like you're mostly there. You need to stage this right and it'll be a breeze. You've got a lot of the logic thought out so let's get started.

First, let's make a scripts folder to run this all from. Do that with these commands:

cd <enter> (this will get you back to your home folder)
mkdir scripts

In this folder I would put a copy of your htaccess-open and htaccess-closed. You don't need to name them .htaccess..., so you won't lose them.

Now for the script. I picked bash since a lot of what you have is already bash-ready.

Code:
#!/bin/bash
#BackupStore.sh
#purpose to close the store and backup the database
echo "Closing up shop"
cat ~/scripts/htaccess-closed > ~/public_html/.htaccess
echo "Sleeping 10 seconds to let connections die"
sleep 10s
mysqldump --opt (DB_NAME) -u (DB_USERNAME) -p(DB_PASSWORD) > /path/to/dbbackup-$(date +%m%d%Y).sql
sleep 3s
cat ~/scripts/htaccess-open > ~/public_html/.htaccess
Run it from cron as sh /path/to/scriptname.sh, or chmod +x it and run /path/to/scriptname.sh directly.
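For example, a crontab entry for a 3:30 AM run might look something like this (the account name and path are only placeholders, adjust them to wherever you actually put the script):

Code:
# run the store backup every night at 3:30 AM
30 3 * * * /bin/sh /home/youraccount/scripts/BackupStore.sh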
 

Brook

Well-Known Member
Apr 22, 2005
Wow Eric you are amazing! I will give this a try tomorrow and let you know how it goes.

Just a quick question though, will it wait for the mysqldump to finish before going on to the next step? (Which you have as sleep 3s.) It's just that they take about 30 minutes to complete for one of my databases!

Also, would it be possible to add a further step to gzip the db?

And this is wishful thinking, but how easy would it be to make the script back up once a day, but only keep backups for the last 7 days, then 1 from a week ago and one from a month ago, then another from 3 months ago. I guess that's pushing it a bit eh? :lol:

Thanks again for your help, much appreciated :cool:
 

Eric

Well-Known Member
Nov 25, 2007
Texas
cPanel Access Level
Root Administrator
Shell scripts only move on to the next command when the current one is done, so you can line the commands right up and the script will do the jobs in order without any extra waiting; it's already good for that.
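In other words, something like this (the filename is just an example) would not start the gzip until the dump had completely finished, even if the dump takes half an hour:

Code:
mysqldump --opt (DB_NAME) -u (DB_USERNAME) -p(DB_PASSWORD) > dbbackup.sql   # the script waits here until the dump is done
gzip dbbackup.sql                                                           # only runs once the .sql file is complete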

If you wanted to gzip or bzip2 the dump, just add a line after the mysqldump that looks like this:

gzip /path/to/dbbackup-$(date +%m%d%Y).sql

or

bzip2 /path/to/dbbackup-$(date +%m%d%Y).sql

I'd try both and see which one gives you better compression on your system.
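A quick way to compare them on one of your real dumps could look something like this (the dump filename is just an example):

Code:
# give each compressor its own copy of the same dump
cp dbbackup.sql dbbackup-copy.sql
gzip dbbackup.sql          # produces dbbackup.sql.gz
bzip2 dbbackup-copy.sql    # produces dbbackup-copy.sql.bz2
ls -lh dbbackup.sql.gz dbbackup-copy.sql.bz2   # compare the sizes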

As for retention, I use (date +%d) for my database backups. This makes one backup for each day of the month, and they overwrite each other after a month. Since not every month has a 29th, 30th, and 31st, those files stick around longer, so I can even go back a few months if needed.
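In the script from earlier that just means changing the date format in the dump filename, something along these lines (path and credentials are placeholders as before):

Code:
mysqldump --opt (DB_NAME) -u (DB_USERNAME) -p(DB_PASSWORD) > /path/to/dbbackup-$(date +%d).sql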

Some people don't have the space for all of this, but it's nice when your brother-in-law calls asking for a backup and you get to ask which day he wants, since you've got 30 of them. :)
 

Brook

Well-Known Member
Apr 22, 2005
Thanks Eric, that's a really neat way of doing it :)

Re the space issue, would it be possible to make a back up just once a week instead? (So we are only saving 4 per month) At a guess I'd say simply set the cron to run once a week instead (so it falls on the same day, 1,8,15,29)? That just seems too simple tho so I'm probably wrong! :lol:

Also, is it ok if I write this up into my guide on back ups (include it with the htaccess site shut-down bit) and post it on my site? I'll credit you of course :) I generally write up useful tips like this and store them on pc for personal use, but when I think the guide may help others I post it on one of my sites too as it makes it easier for me to share it with others when I think they may need it.

Any other tips would also be appreciated! (For eg, do you think it would be better to do a mysql dump once a week, and maybe a mysqlhotcopy every 3 days?)

Thanks again for your help, much appreciated!
 

Eric

Well-Known Member
Nov 25, 2007
Texas
cPanel Access Level
Root Administrator
Howdy,

If you just want to keep n copies, name the backups in some more generic fashion, then rotate them before the mysqldump. It'd look something like the below.

Code:
rm dbname.bz.4
mv dbname.bz.3 dbname.bz.4
mv dbname.bz.2 dbname.bz.3
mv dbname.bz.1 dbname.bz.2
mv dbname.bz dbname.bz.1

Now run your backup! :cool:

The dates are still on the files' timestamps; if you ever wanted to see them, ls -l would show them.

As for partial backups, I like a simple 1 file = 1 good backup. But that's my personal opinion.

Also, I don't keep backups on the server; I scp the files off-site as soon as they're made.

You're welcome to use anything I've posted; if you mention my name, send me a link. :)
 

Brook

Well-Known Member
Apr 22, 2005
Sounds good. Shall we go the whole hog and add SCP transfers too? :D

I have no idea where to start there tho as I'm not familiar with SCP at all. I guess it could work with a simple hosting account on someone else's server? Would make for a cheap way to make off-site back-ups :cool:

I hope I am not being cheeky by asking for all this help Eric, I appreciate how busy you might be.
 

Spiral

BANNED
Jun 24, 2005
If you wanted to gzip or bzip, just add it to the line after the mysql dump to look like this:

gzip /path/to/dbbackup-$(date +%m%d%Y).sql

or

bzip2 /path/to/dbbackup-$(date +%m%d%Y).sql

I'd try both and see which one gives you better compression on your system.
Just a technical note to answer your question ...

BZIP2 uses much tighter compression than GZIP, and in every case
I have ever seen the output is substantially smaller than GZIP's.

The disadvantage to BZIP2 is that it is a slower process, so if you are
in a big hurry then GZIP might be better since it is very quick, but
if you are looking to save space then BZIP2 is definitely the better choice.

ZIP, another option, varies depending on content and may be
slightly smaller or larger than BZIP2 depending on what you are
compressing. Same goes for RAR which tends to have output sizes
very comparable to BZIP2 for most output files.

A newer one, LZIP, is the only one I have seen in standard Linux
environments that consistently outputs smaller files than BZIP2.
LZIP is much slower than either GZIP or BZIP2, but it does produce
some impressively tight compression.

LZIP isn't added by default with most distributions so you will probably
need to download and compile the source if you want it available on
your server but it is good if you want really tight files that can be
transferred very quickly. Like I said, the disadvantage is that it is
very slow so you probably don't want to build archives on huge files.
For that, BZIP2 is a better compromise between speed and filesize.
 

Brook

Well-Known Member
Apr 22, 2005
Thanks Spiral - how reliable is bzip2?

I did a quick test:

Database Size 788.9 MB

Gzip took 1 minute = 249.3 MB
Bzip2 took 2 minutes = 185.8 MB

That's a saving of 63.5 MB

And over a 30 day period it would save almost 2GB! (1905 MB) (total 5.5GB instead of 7.5GB)

They both take about a minute to unzip.

I think with that I'll go with the full daily back-ups, one for each day of the month :)
 

Brook

Well-Known Member
Apr 22, 2005
I tried it but it didn't work :(

Here's the contents of backupstore.sh (are the permissions meant to be 512? x x x?)

Code:
#!/bin/bash
#BackupStore.sh
#purpose to close the store and backup the database
echo "Closing up shop"
cat /home/7-letter-accountname-where-script-is/myscripts/htaccess-closed > /home/account-name-of-site/public_html/.htaccess
echo "Sleeping 10 seconds to let connections die"
sleep 10s
mysqldump --opt db-name -u username -p password > /home/account-to-save-to/public_html/folder1/dbbackup-$(date +%d).sql
sleep 3s
cat /home/7-letter-accountname-where-script-is/myscripts/htaccess-open > /home/account-name-of-site/public_html/.htaccess
Then I set up a cron in cpanel to run at 3.30am. Waited till then (ran the 'date' command to check time was right) but it didn't work (site was still 'on').

I tried it with '~' before the path in the 'cat' lines and without (as shown above). Do I need that btw? (I guessed it was to do with relative paths?)

Can anyone spot what I've done wrong? :-/
 

Eric

Well-Known Member
Nov 25, 2007
Texas
cPanel Access Level
Root Administrator
Howdy,

What's your cronjob look like? Could you dump crontab -l and pwd to me or PM it? I'm betting you need the full path to the script to make it work. Also it's going to depend on how you're calling it (the whole sh vs ./ argument).

I <3 bzip; that's the only reason I offered it up, but some folks are all about the g(un)zip. I recommend those two because they're installed on just about every *nix system.

Let's get this thing running, then we'll peek at the scp side of things.

Thanks!
 

Brook

Well-Known Member
Apr 22, 2005
Update:

With help from Eric via PM - we've got it to work :) (I just needed to run everything from within the actual account).

All we need now is how to overwrite the files so we take daily back ups that get overwritten each month, and the SCP bit :D

Thanks for all your help Eric, you're a star!
 

Eric

Well-Known Member
Nov 25, 2007
Texas
cPanel Access Level
Root Administrator
Howdy,

You may need to clean things out before you backup for the day. A little rm'ing with the same logic you used to make the file should do the trick. I'll show you how to do that remotely in a second.
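As a sketch, with the day-of-month naming from earlier, that clean-out could be a single line before the dump (the path is a placeholder):

Code:
rm -f /path/to/dbbackup-$(date +%d).sql.bz2   # clear last month's file for today's slot before dumping again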

Backing up via remote can be done in a lot of ways, ssh, sftp, ftp and on and on. I like scp, but it may not be for you. First thing you need is a remote machine.

On this remote machine you need to make a backup user to accept the backup data. I don't recommend backing up as root or your personal use user. Let's make this a little out of the way for now, so it can't be easily deleted.

Also, try to avoid the obvious username "backup"; I can't count how many times I've seen a brute force script start with root, then admin, then backup. Give this user a super long password that you're never likely to type by hand. Taking the md5sum of a random logfile works nicely. :D
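For instance, something like this (the logfile path is just an example) spits out a 32-character string you can paste in as the password:

Code:
md5sum /var/log/messages | awk '{print $1}'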

Now that we've got a user to receive the files, let's set up the method to authenticate between the two. For that we're going to use SSH keys. I could type this all out and make this lengthy post longer, but I'm just going to link to it.

HOWTO: set up ssh keys
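The short version looks roughly like this, assuming OpenSSH with its default key paths (backupuser@HostnameOrIP is a placeholder for your backup account):

Code:
# on the cPanel server, as the user that runs the backup script
ssh-keygen -t rsa
# leave the passphrase empty if cron has to run this unattended
# append the public key to the backup user's authorized_keys on the remote box
cat ~/.ssh/id_rsa.pub | ssh backupuser@HostnameOrIP 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'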

Once you can ssh from the cPanel server to the backup server without password interruption you're ready for scp. The scp command should look like this:

scp /path/to/filename username@HostnameOrIP:/path/to/backup/dir

Add that onto your backup script and you'll be good. I'd let it run a few days to make sure you're getting the files in both places. Once you're sure it is, add an rm line to clear out the cPanel server and keep your quotas in check.
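Tacked onto the end of the backup script, that could look something like this (all the paths and the host are placeholders):

Code:
# copy today's backup off to the remote machine
scp /path/to/dbbackup-$(date +%d).sql.bz2 backupuser@HostnameOrIP:/path/to/backup/dir
# once you've confirmed the copies are landing, clear the local file to keep quotas in check
rm -f /path/to/dbbackup-$(date +%d).sql.bz2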

I mentioned running commands on remote servers and here's how you do that.

ssh user@HostnameOrIP 'remote command'
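For example, you could use that to clear today's slot on the backup box before the fresh copy arrives (again, names are placeholders; the $(date +%d) inside the double quotes is expanded locally before the command is sent):

Code:
ssh backupuser@HostnameOrIP "rm -f /path/to/backup/dir/dbbackup-$(date +%d).sql.bz2"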

A few common gotcha points:
-Make sure you're sshing as the right user
-Make sure the authorized_keys and authorized_keys2 files are 644

Enjoy!
 

Brook

Well-Known Member
Apr 22, 2005
Phew, that seems like a lot to pick up - but I will give it a go for sure :cool:

Re the daily back-ups, I've added the -f flag, so bzip2 is forced to overwrite - I figured this way there are two back ups at all times, one on the parent server and another off-site.

Just a quick question about the SSH keys... I was planning on renting some hosting space for the back-up server, so are there any security implications by placing my SSH key on that server for the purpose of these back-ups? For example, if their server got hacked and the hackers got my SSH key, would they be able to connect to my server? (if so, and do what?)

Alternatively, I guess I could scp via my Mac somehow? That way I'll have copies of the back-ups on my own computer.

(I don't really know anything about SSH keys sorry!)
 

Brook

Well-Known Member
Apr 22, 2005
Sorry Eric, I'm stuck on the SSH/SCP.

I thought, as I am going to use a hosting company for the back-up server, I'd use cPanel to generate the keys (as they're bound to be using cPanel). I followed this: Tutorial to use SSH keys instead of password - Web Hosting Talk

And I am able to connect to the host server (currently my second server) via: [root@localhost ~]# ssh -i /path-to-file/file.key username@HostnameOrIP

But I can't seem to use that to do a back up with the SCP command in your last post: scp /path/to/filename username@HostnameOrIP:/path/to/backup/dir

Do I need to use a different command? Where am I going wrong?
 

Brook

Well-Known Member
Apr 22, 2005
Still stuck on this, but I've played around with it more and seem to have worked out this much (please correct me if I'm wrong).

I create a SSH key on the machine I want to connect _from_

Then I import the public key from that machine to the machine I want to connect _to_

Well I managed that on my Mac, being able to connect to my server. It simply asked me for the passphrase then connected, and now it just simply connects without asking for the passphrase :)

But I tried the same on the server (to server): created a key on the machine I want to connect from (the host), and then imported the public key into the account on the machine I want to connect to (the back-up destination server). So then I su to the host machine's account (for that key) and try to ssh account_name@destination-server, but then it asks for a password.

I've looked in authorized_keys for the destination server's account and my imported key is in there (the only thing worth noting is that it ends in account_name@root.server.hostname instead of account_name@domain).

Any ideas what I'm doing wrong?
 

Brook

Well-Known Member
Apr 22, 2005
Hi Eric, I've just tried it with:

scp -i /path-to-file/file.key /path/to/filename username@HostnameOrIP:/path/to/backup/dir

and it asks for a passphrase then copies the file. I guess I need some way to cache the passphrase so it doesn't ask for it?

If I do it the other way I may not be able to use a normal host as they may not give me ssh access :-/

Are there any other methods we can use apart from SCP?