
How to chart your backup times

Discussion in 'cPanel Developers' started by Erik Knepfler, Nov 27, 2016.

  1. Erik Knepfler

    I needed a solution to chart my backup times for day-to-day comparison. So I wrote one, and thought I'd share it here. It takes a collection of cpbackup logfiles located in the same directory as the script and parses them into a CSV. I'm partial to Excel PivotTables, so it's set up to work well with those. See the attached PNG for an example of the chart; tips on exactly how to build it are at the bottom of the script. Hope this helps someone!

    Code:
    ## cPanel Backup Times to CSV
    ## Erik Knepfler, HaveAByte.com
    ## License: https://creativecommons.org/licenses/by/4.0/
    
    # Usage:  Place a collection of cpbackup logfiles (usually found in /usr/local/cpanel/logs/cpbackup) into the same folder as this script and run it.  Redirect to a file with:  perl cpbackup-times.pl > output.csv
    
    use strict; # Always a good idea
    use warnings; # catch typos and undefined-value mistakes
    use Time::Local; # for time conversions
    
    # push all logfiles into @logs
    my @logs;
    opendir(DIR, ".") or die $!;
    while ($_ = readdir(DIR)) {
       push(@logs, $_) if /\.log/i;
    }
    closedir(DIR);
    
    # beginning of CSV output.
    print "backupdate,user,start,totalseconds\n";
    
    # parse each logfile
    for my $logfile (@logs) {
       my %data; # stores everything we find in this log; declared inside the loop so it resets for each logfile
       open(FILE, $logfile) || die $!;
       $_ = <FILE>; # grab the first line in order to pull an official backup start time.
       /(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})/; # date match regexp.  stores into $1 of course
       my $backupdate = $1;
       while ($_ = <FILE>) { # read rest of file
          if (/ user \: (.*?) /) { # get the user(account) being backed up
             my $user = $1;
             /(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})/; # $1 = datetime this user backup started
             $data{$user}{'start'} = tc($1); # backup start time.  converted with tc()
          until (/pkgacct completed/ || eof) { # note: somewhat dangerous; if "pkgacct completed" never appears for a particular backup, it would probably manifest as combining two backup accounts into one
                $_ = <FILE>; # keep reading the file until we match
             }
             /(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})/;
             $data{$user}{'end'} = tc($1); # backup ended at this time, also converted with tc()
          }
       }
       close(FILE);
       # done reading file, sort by user and output single CSV line for each user
       for (sort(keys(%data))) {
          print "$backupdate, $_, " . localtime($data{$_}{'start'}) . "," .  ($data{$_}{'end'} - $data{$_}{'start'}) . "\n";
       }
    }
    
    sub d { print $_[0] . "\n";} # for debug output
    
    sub tc { # convert a "2016-11-16 23:00:02 -0800"-style datetime to epoch seconds; ignores the tz offset
       my $datetime = $_[0];
       my ($year, $mon, $mday, $hour,$min,$sec) = split(/[\s.:-]+/, $datetime);
       my $t = timelocal($sec,$min,$hour,$mday,$mon-1,$year);
       return($t);
    }
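
    # e.g. tc("2016-11-16 23:00:02") returns the same epoch value as
    # timelocal(2, 0, 23, 16, 10, 2016), using this machine's local timezone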
    
    # Excel charting tips:
    # insert a table around the data
    # click the table
    # Insert > PivotTable
    # in the PivotTable:
    # put backupdate into Legend (Series)
    # put user into Axis (Categories)
    # put totalseconds into Values
    # sort Row Labels by Sum of totalseconds, ascending
    # click the table and insert a line PivotChart
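
    For reference, the CSV output looks roughly like this. The user names and
    timings below are made up for illustration; the start column is Perl's
    scalar localtime() string for each account's backup start:

    Code:
    backupdate,user,start,totalseconds
    2016-11-16 23:00:02, alice, Wed Nov 16 23:00:05 2016,142
    2016-11-16 23:00:02, bob, Wed Nov 16 23:02:27 2016,371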
    
    
     

    Attached Files: example chart PNG (referenced above)
  2. cPJacob

    cPJacob, cPanel Product Owner (Staff Member)
    Erik,

    Thanks for sharing! I'll use this myself :)
     