SOLVED Bash script to find large error_log files? Run as cron?

sneader

Well-Known Member
Aug 21, 2003
1,195
65
178
La Crosse, WI
cPanel Access Level
Root Administrator
My bash skills are lacking. I'm sorry. :)

I'd like to run a cron every week that emails root a list of any large error_log files, perhaps over 50MB. The cause of the errors can be researched, to be more proactive in helping customers, and also may help save disk space (including backup space).

I've got this far... this script, when run as root from the command line, definitely does return a list of error_log files that are over 50MB:

Code:
find /home/*/public_html/ -type f -iname error_log -size +50M -exec du -sh {} \;
I bet that can be improved.

The problem is that when I try that as a cron, it gives all sorts of errors. I've tried putting it into a shell script, and cron'ing the script, but same result.
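A likely culprit, assuming a typical setup: cron runs jobs with a minimal environment (a short PATH, no login profile), so commands that work fine at a root shell can fail under cron. A common fix is to wrap the command in a script with a shebang and full binary paths. The paths below are typical but assumed; verify them on your own server with `which find` and `which du`:

```shell
#!/bin/bash
# Hypothetical wrapper for cron: cron provides only a minimal PATH,
# so call each binary by its full path (locations assumed typical;
# confirm with `which find` and `which du` on your server).
/usr/bin/find /home/*/public_html/ -type f -iname error_log -size +50M \
    -exec /usr/bin/du -sh {} \;
```

Calling the script from cron by its absolute path (e.g. `/root/script.sh`) also avoids relying on cron's PATH at all.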

Waiving white flag. Help, oh smart ones!

- Scott
 

vacancy

Well-Known Member
Sep 20, 2012
523
203
93
Turkey
cPanel Access Level
Root Administrator
You may want to use simpler logic and just delete all the error_log files, without filtering by size.

Code:
0 2 * * * rm -rf /home/*/public_html/error_log > /dev/null 2>&1
This deletes all error_log files at 02:00 every night.
 

sneader

Well-Known Member
Aug 21, 2003
1,195
65
178
La Crosse, WI
cPanel Access Level
Root Administrator
You may want to use simpler logic and just delete all the error_log files, without filtering by size.
No, I don't want to automatically delete the files. As I mentioned, I want to proactively help customers troubleshoot any problems that are causing the error_log files in the first place. Plus, I think it is rude to delete these files without notifying customers, as they might be actively troubleshooting their own problems and will not be happy to find their error_log files magically gone.

Thanks, though.

- Scott
 

vacancy

Well-Known Member
Sep 20, 2012
523
203
93
Turkey
cPanel Access Level
Root Administrator
Deleting the files does not mean you have disabled error logging.

If an error occurs, the files are automatically re-created. You can also make the deletion interval longer.
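An editorial aside, not from the thread: since the files get re-created anyway, an arguably gentler alternative to deletion is truncating them in place, which keeps each file's ownership and permissions and reclaims the space even if a process still holds the file open. A minimal sketch, assuming the same layout as the commands above:

```shell
# Hypothetical alternative to rm: truncate oversized error_log files
# to zero bytes instead of deleting them. Ownership and permissions
# are preserved, so PHP can keep appending without re-creating the file.
find /home/*/public_html -type f -name error_log -size +50M \
    -exec truncate -s 0 {} \;
```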
 

vacancy

Well-Known Member
Sep 20, 2012
523
203
93
Turkey
cPanel Access Level
Root Administrator
The following command might work.

ATTENTION: Please try it first on a test server.

Code:
find /home/*/public_html -size +50M -name "error_log" -exec rm -rf {} \;
 

sneader

Well-Known Member
Aug 21, 2003
1,195
65
178
La Crosse, WI
cPanel Access Level
Root Administrator
I have this working now, for anyone that is interested in getting a regular report of large error_log files:

First, create this script, name it find-big-errorlogs.sh (or whatever) and put it somewhere handy (maybe /root/ ?). Set it to 755 permissions.

Code:
#!/bin/bash
find /home/*/public_html/ -type f -iname error_log -size +50M -exec du -sh {} \; | /bin/mail -s "myserver Big error_logs" [email protected]
Replace 'myserver' with something like the hostname of your server, to remind you which server the report is for.
Replace '[email protected]' with whatever email address you want to receive the report.
Replace '50M' with whatever size you want to look for. If you want only 100MB or larger files, change it to 100M.
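One optional refinement, offered here as an editorial suggestion rather than part of Scott's script: capture the `find` output first and only send mail when something was actually found, so the weekly report stays quiet on clean weeks. A sketch, assuming the same paths and addresses as above:

```shell
#!/bin/bash
# Hypothetical variant of find-big-errorlogs.sh: build the report first,
# then mail it only if at least one oversized error_log was found.
report=$(find /home/*/public_html/ -type f -iname error_log -size +50M -exec du -sh {} \;)
if [ -n "$report" ]; then
    # 'myserver' and the address are placeholders, as in the original script.
    printf '%s\n' "$report" | /bin/mail -s "myserver Big error_logs" [email protected]
fi
```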

Next, add the script to your crontab. Just run crontab -e, and to have it run at 3AM every Sunday morning, add something like:

Code:
0 3 * * 0  /root/find-big-errorlogs.sh
Now you will get a list of large error_logs, to do with as you see fit.

- Scott
 

cPanelMichael

Administrator
Staff member
Apr 11, 2011
47,903
2,237
463
Hello Scott,

I'm happy to see you found a viable solution. Thank you for updating this thread with the outcome.