In the past I've been using wget, run against various folders within my www structure, to synchronise my data across two RRDNS servers.
E.g. news item images get synced every 15 minutes or so,
videos every 30, ads every hour, etc.
It doesn't make sense to do all of them at once as there's several GB of data.
The thing is, for some reason wget won't timestamp properly any more (it used to): it works on images, but the videos and MP3s always download in their entirety every time.
I've tried this between several servers, in both directions, and I still get the problem, so I don't think it's down to one specific box.
Here's my bash script:
cd /home/wheremystuff is
wget -nH -r -I/public_html/vids/ -N ftp://user:pass@myip/public_html/
exit
The scripts are all identical apart from the -I switch, which specifies which folder to grab.
Anyone else had trouble?
And the next part: if this can't be made to work, how do I get around it? I'm told rsync might do it, but again I need to cron the processes on a folder-by-folder basis, and I'm told this is tricky with timestamping.
Many thanks for any light you can shed on this.