Nate Reist

Member
Jul 20, 2018
20
4
3
Michigan
cPanel Access Level
Root Administrator
I'm also having an issue with the cPanel backup transport on two servers. In my case I'm sending backups to S3; the process is extremely slow and gets backlogged, and the server eventually ends up too full to continue. My issue started just a few days ago. I'm happy to contribute log files or any other details to help you come up with a resolution!

Thanks.
 

Nate Reist

It seems to be just slow: up to a few hours to upload a gzip of a few hundred MB. Below is an excerpt from the log. The last upload never completed (I killed the process and cleared out the backups because we were out of storage).

Code:
[2020-03-05 02:06:21 -0800] info [cpbackup_transporter] Starting a "copy" operation on the "S3 Backups" destination ID "{ID OMITTED}".
[2020-03-05 02:06:21 -0800] info [cpbackup_transporter] Validating destination path /dv02backups/2020-03-05/accounts/
[2020-03-05 02:06:21 -0800] info [cpbackup_transporter] Path exists
[2020-03-05 02:06:21 -0800] info [cpbackup_transporter] Uploading account backup /backup/2020-03-05/accounts/wrpglobal.tar.gz to /dv02backups/2020-03-05/accounts/wrpglobal.tar.gz (from 2020-03-05/accounts/wrpglobal.tar.gz)
[2020-03-05 02:06:21 -0800] info [cpbackup_transporter] Attempting to upload /backup/2020-03-05/accounts/wrpglobal.tar.gz to /dv02backups/2020-03-05/accounts/wrpglobal.tar.gz for destination:  S3 Backups
[2020-03-05 02:06:21 -0800] info [cpbackup_transporter] Upload attempt #1 starting for /backup/2020-03-05/accounts/wrpglobal.tar.gz to /dv02backups/2020-03-05/accounts/wrpglobal.tar.gz for destination:  S3 Backups
[2020-03-05 02:42:59 -0800] info [cpbackup_transporter] Successful transfer of /backup/2020-03-05/accounts/wrpglobal.tar.gz to /dv02backups/2020-03-05/accounts/wrpglobal.tar.gz for destination S3 Backups
[2020-03-05 02:42:59 -0800] info [cpbackup_transporter] The backup has been successfully uploaded at least once, now we will delete the local copy (/backup/2020-03-05/accounts/wrpglobal.tar.gz) since keeplocal (0) is disabled.
[2020-03-05 02:42:59 -0800] info [cpbackup_transporter] cpbackup_transporter - Processing next task
[2020-03-05 02:42:59 -0800] info [cpbackup_transporter] Instantiating Object
[2020-03-05 02:43:00 -0800] info [cpbackup_transporter] Starting a "copy" operation on the "S3 Backups" destination ID "{ID OMITTED}".
[2020-03-05 02:43:00 -0800] info [cpbackup_transporter] Validating destination path /dv02backups/2020-03-05/accounts/
[2020-03-05 02:43:00 -0800] info [cpbackup_transporter] Path exists
[2020-03-05 02:43:00 -0800] info [cpbackup_transporter] Uploading account backup /backup/2020-03-05/accounts/kingdomfp.tar.gz to /dv02backups/2020-03-05/accounts/kingdomfp.tar.gz (from 2020-03-05/accounts/kingdomfp.tar.gz)
[2020-03-05 02:43:00 -0800] info [cpbackup_transporter] Attempting to upload /backup/2020-03-05/accounts/kingdomfp.tar.gz to /dv02backups/2020-03-05/accounts/kingdomfp.tar.gz for destination:  S3 Backups
[2020-03-05 02:43:00 -0800] info [cpbackup_transporter] Upload attempt #1 starting for /backup/2020-03-05/accounts/kingdomfp.tar.gz to /dv02backups/2020-03-05/accounts/kingdomfp.tar.gz for destination:  S3 Backups
[2020-03-05 02:50:38 -0800] info [cpbackup_transporter] Successful transfer of /backup/2020-03-05/accounts/kingdomfp.tar.gz to /dv02backups/2020-03-05/accounts/kingdomfp.tar.gz for destination S3 Backups
[2020-03-05 02:50:38 -0800] info [cpbackup_transporter] The backup has been successfully uploaded at least once, now we will delete the local copy (/backup/2020-03-05/accounts/kingdomfp.tar.gz) since keeplocal (0) is disabled.
[2020-03-05 02:50:38 -0800] info [cpbackup_transporter] cpbackup_transporter - Processing next task
[2020-03-05 02:50:38 -0800] info [cpbackup_transporter] Instantiating Object
[2020-03-05 02:50:38 -0800] info [cpbackup_transporter] Starting a "copy" operation on the "S3 Backups" destination ID "{ID OMITTED}".
[2020-03-05 02:50:38 -0800] info [cpbackup_transporter] Validating destination path /dv02backups/2020-03-05/accounts/
[2020-03-05 02:50:39 -0800] info [cpbackup_transporter] Path exists
[2020-03-05 02:50:39 -0800] info [cpbackup_transporter] Uploading account backup /backup/2020-03-05/accounts/ywcawcmi.tar.gz to /dv02backups/2020-03-05/accounts/ywcawcmi.tar.gz (from 2020-03-05/accounts/ywcawcmi.tar.gz)
[2020-03-05 02:50:39 -0800] info [cpbackup_transporter] Attempting to upload /backup/2020-03-05/accounts/ywcawcmi.tar.gz to /dv02backups/2020-03-05/accounts/ywcawcmi.tar.gz for destination:  S3 Backups
[2020-03-05 02:50:39 -0800] info [cpbackup_transporter] Upload attempt #1 starting for /backup/2020-03-05/accounts/ywcawcmi.tar.gz to /dv02backups/2020-03-05/accounts/ywcawcmi.tar.gz for destination:  S3 Backups
[2020-03-05 05:04:35 -0800] info [cpbackup_transporter] Successful transfer of /backup/2020-03-05/accounts/ywcawcmi.tar.gz to /dv02backups/2020-03-05/accounts/ywcawcmi.tar.gz for destination S3 Backups
[2020-03-05 05:04:35 -0800] info [cpbackup_transporter] The backup has been successfully uploaded at least once, now we will delete the local copy (/backup/2020-03-05/accounts/ywcawcmi.tar.gz) since keeplocal (0) is disabled.
[2020-03-05 05:04:35 -0800] info [cpbackup_transporter] cpbackup_transporter - Processing next task
[2020-03-05 05:04:35 -0800] info [cpbackup_transporter] Instantiating Object
[2020-03-05 05:04:35 -0800] info [cpbackup_transporter] Starting a "copy" operation on the "S3 Backups" destination ID "{ID OMITTED}".
[2020-03-05 05:04:35 -0800] info [cpbackup_transporter] Validating destination path /dv02backups/2020-03-05/accounts/
[2020-03-05 05:04:36 -0800] info [cpbackup_transporter] Path exists
[2020-03-05 05:04:36 -0800] info [cpbackup_transporter] Uploading account backup /backup/2020-03-05/accounts/issacharfund.tar.gz to /dv02backups/2020-03-05/accounts/issacharfund.tar.gz (from 2020-03-05/accounts/issacharfund.tar.gz)
[2020-03-05 05:04:36 -0800] info [cpbackup_transporter] Attempting to upload /backup/2020-03-05/accounts/issacharfund.tar.gz to /dv02backups/2020-03-05/accounts/issacharfund.tar.gz for destination:  S3 Backups
[2020-03-05 05:04:36 -0800] info [cpbackup_transporter] Upload attempt #1 starting for /backup/2020-03-05/accounts/issacharfund.tar.gz to /dv02backups/2020-03-05/accounts/issacharfund.tar.gz for destination:  S3 Backups
[2020-03-05 05:57:12 -0800] info [cpbackup_transporter] Successful transfer of /backup/2020-03-05/accounts/issacharfund.tar.gz to /dv02backups/2020-03-05/accounts/issacharfund.tar.gz for destination S3 Backups
[2020-03-05 05:57:12 -0800] info [cpbackup_transporter] The backup has been successfully uploaded at least once, now we will delete the local copy (/backup/2020-03-05/accounts/issacharfund.tar.gz) since keeplocal (0) is disabled.
[2020-03-05 05:57:12 -0800] info [cpbackup_transporter] cpbackup_transporter - Processing next task
[2020-03-05 05:57:12 -0800] info [cpbackup_transporter] Instantiating Object
[2020-03-05 05:57:12 -0800] info [cpbackup_transporter] Starting a "copy" operation on the "S3 Backups" destination ID "{ID OMITTED}".
[2020-03-05 05:57:12 -0800] info [cpbackup_transporter] Validating destination path /dv02backups/2020-03-05/accounts/
[2020-03-05 05:57:13 -0800] info [cpbackup_transporter] Path exists
[2020-03-05 05:57:13 -0800] info [cpbackup_transporter] Uploading account backup /backup/2020-03-05/accounts/sourcit.tar.gz to /dv02backups/2020-03-05/accounts/sourcit.tar.gz (from 2020-03-05/accounts/sourcit.tar.gz)
[2020-03-05 05:57:13 -0800] info [cpbackup_transporter] Attempting to upload /backup/2020-03-05/accounts/sourcit.tar.gz to /dv02backups/2020-03-05/accounts/sourcit.tar.gz for destination:  S3 Backups
[2020-03-05 05:57:13 -0800] info [cpbackup_transporter] Upload attempt #1 starting for /backup/2020-03-05/accounts/sourcit.tar.gz to /dv02backups/2020-03-05/accounts/sourcit.tar.gz for destination:  S3 Backups
[2020-03-05 06:33:57 -0800] info [cpbackup_transporter] Successful transfer of /backup/2020-03-05/accounts/sourcit.tar.gz to /dv02backups/2020-03-05/accounts/sourcit.tar.gz for destination S3 Backups
[2020-03-05 06:33:57 -0800] info [cpbackup_transporter] The backup has been successfully uploaded at least once, now we will delete the local copy (/backup/2020-03-05/accounts/sourcit.tar.gz) since keeplocal (0) is disabled.
[2020-03-05 06:33:57 -0800] info [cpbackup_transporter] cpbackup_transporter - Processing next task
[2020-03-05 06:33:57 -0800] info [cpbackup_transporter] Instantiating Object
[2020-03-05 06:33:57 -0800] info [cpbackup_transporter] Starting a "copy" operation on the "S3 Backups" destination ID "{ID OMITTED}".
[2020-03-05 06:33:57 -0800] info [cpbackup_transporter] Validating destination path /dv02backups/2020-03-05/accounts/
[2020-03-05 06:33:58 -0800] info [cpbackup_transporter] Path exists
[2020-03-05 06:33:58 -0800] info [cpbackup_transporter] Uploading account backup /backup/2020-03-05/accounts/moorsgolf.tar.gz to /dv02backups/2020-03-05/accounts/moorsgolf.tar.gz (from 2020-03-05/accounts/moorsgolf.tar.gz)
[2020-03-05 06:33:58 -0800] info [cpbackup_transporter] Attempting to upload /backup/2020-03-05/accounts/moorsgolf.tar.gz to /dv02backups/2020-03-05/accounts/moorsgolf.tar.gz for destination:  S3 Backups
[2020-03-05 06:33:58 -0800] info [cpbackup_transporter] Upload attempt #1 starting for /backup/2020-03-05/accounts/moorsgolf.tar.gz to /dv02backups/2020-03-05/accounts/moorsgolf.tar.gz for destination:  S3 Backups
[2020-03-05 09:18:53 -0800] info [cpbackup_transporter] Successful transfer of /backup/2020-03-05/accounts/moorsgolf.tar.gz to /dv02backups/2020-03-05/accounts/moorsgolf.tar.gz for destination S3 Backups
[2020-03-05 09:18:53 -0800] info [cpbackup_transporter] The backup has been successfully uploaded at least once, now we will delete the local copy (/backup/2020-03-05/accounts/moorsgolf.tar.gz) since keeplocal (0) is disabled.
[2020-03-05 09:18:54 -0800] info [cpbackup_transporter] cpbackup_transporter - Processing next task
[2020-03-05 09:18:54 -0800] info [cpbackup_transporter] Instantiating Object
[2020-03-05 09:18:54 -0800] info [cpbackup_transporter] Starting a "copy" operation on the "S3 Backups" destination ID "{ID OMITTED}".
[2020-03-05 09:18:54 -0800] info [cpbackup_transporter] Validating destination path /dv02backups/2020-03-05/accounts/
[2020-03-05 09:18:54 -0800] info [cpbackup_transporter] Path exists
[2020-03-05 09:18:54 -0800] info [cpbackup_transporter] Uploading account backup /backup/2020-03-05/accounts/tommccarthyassoc.tar.gz to /dv02backups/2020-03-05/accounts/tommccarthyassoc.tar.gz (from 2020-03-05/accounts/tommccarthyassoc.tar.gz)
[2020-03-05 09:18:54 -0800] info [cpbackup_transporter] Attempting to upload /backup/2020-03-05/accounts/tommccarthyassoc.tar.gz to /dv02backups/2020-03-05/accounts/tommccarthyassoc.tar.gz for destination:  S3 Backups
[2020-03-05 09:18:54 -0800] info [cpbackup_transporter] Upload attempt #1 starting for /backup/2020-03-05/accounts/tommccarthyassoc.tar.gz to /dv02backups/2020-03-05/accounts/tommccarthyassoc.tar.gz for destination:  S3 Backups
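For anyone trying to quantify the slowdown from a log like this: the gap between each "Upload attempt #1 starting" line and the matching "Successful transfer" line gives the per-file transfer time. A minimal sketch; the regexes below are my assumptions based on the exact format in the excerpt, not an official cPanel parser:

```python
# Hypothetical sketch: measure per-file upload durations from
# cpbackup_transporter log text. Regexes assume the log format shown
# in the excerpt above.
import re
from datetime import datetime

START = re.compile(r"\[(.+?) [-+]\d{4}\].*Upload attempt #1 starting for (\S+)")
DONE = re.compile(r"\[(.+?) [-+]\d{4}\].*Successful transfer of (\S+)")

def transfer_durations(log_text):
    """Return {local_path: seconds} for each completed upload."""
    starts, durations = {}, {}
    for line in log_text.splitlines():
        m = START.search(line)
        if m:
            # Remember when upload attempt #1 for this file began.
            starts[m.group(2)] = datetime.strptime(m.group(1), "%Y-%m-%d %H:%M:%S")
            continue
        m = DONE.search(line)
        if m and m.group(2) in starts:
            # Pair the success line with its start line and diff the timestamps.
            t = datetime.strptime(m.group(1), "%Y-%m-%d %H:%M:%S")
            durations[m.group(2)] = (t - starts.pop(m.group(2))).total_seconds()
    return durations
```

Dividing each archive's size by its duration gives effective throughput; the wrpglobal upload above ran from 02:06:21 to 02:42:59, roughly 37 minutes, which for a few-hundred-MB archive works out to well under 1 MB/s.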
 

Nate Reist

Can you think of any reason two separate servers would be having the same issue transferring to S3? This just started, and I am at a loss as to what could be causing it.

The process is running; I can trace it in the Process Manager in WHM, and it repeatedly outputs information like this:

Code:
select(16, [11], [11], NULL, {28, 175601}) = 1 (out [11], left {27, 654504})
write(11, "\27\3\[email protected]\30,,!\346 \313)y\324C\341 w\373n\252\313\"{\327\241\237\352\2606\332\33"..., 16413) = 16413
Any help is appreciated!
 

Nate Reist

I've moved this to an S3-compatible backup destination configuration (Spaces on DigitalOcean) and the speed issue persists, so it would seem it is not an issue with S3 specifically. The transport just can't keep up with how fast the server fills up.
 

Nate Reist

Is there anywhere to review the actual content of this error on the server? A 403 can mean a lot of different errors with S3.

Code:
[2020-03-25 07:54:09 -0700] warn [cpbackup_transporter] Upload attempt failed: Amazon::S3: Amazon responded with 403 Forbidden
 at /usr/local/cpanel/3rdparty/perl/530/lib/perl5/cpanel_lib/Amazon/S3/Bucket.pm line 136.
 at /usr/local/cpanel/Cpanel/LoggerAdapter.pm line 27.
    Cpanel::LoggerAdapter::warn(Cpanel::LoggerAdapter=HASH(0x1f6c810), "Upload attempt failed: Amazon::S3: Amazon responded with 403 "...) called at /usr/local/cpanel/Cpanel/Backup/Queue.pm line 658
    Cpanel::Backup::Queue::transport_backup::attempt_to_upload_file(Cpanel::Backup::Queue::transport_backup=HASH(0x1d6e1b0), Cpanel::Transport::Files::S3Compatible=HASH(0x3e10db0), "/backup/2020-03-25/accounts/triangleinc.tar.gz", "/dv02backups/2020-03-25/accounts/triangleinc.tar.gz", Cpanel::LoggerAdapter=HASH(0x1f6c810)) called at /usr/local/cpanel/Cpanel/Backup/Queue.pm line 333
    Cpanel::Backup::Queue::transport_backup::process_task(Cpanel::Backup::Queue::transport_backup=HASH(0x1d6e1b0), Cpanel::TaskQueue::Task=HASH(0x3db11a0), Cpanel::LoggerAdapter=HASH(0x1f6c810)) called at /usr/local/cpanel/Cpanel/TaskQueue.pm line 635
    eval {...} called at /usr/local/cpanel/Cpanel/TaskQueue.pm line 638
    Cpanel::TaskQueue::__ANON__() called at /usr/local/cpanel/Cpanel/StateFile.pm line 241
    eval {...} called at /usr/local/cpanel/Cpanel/StateFile.pm line 241
    Cpanel::StateFile::Guard::call_unlocked(Cpanel::StateFile::Guard=HASH(0x3d81728), CODE(0x3d8a7f0)) called at /usr/local/cpanel/Cpanel/TaskQueue.pm line 643
    Cpanel::TaskQueue::process_next_task(Cpanel::TaskQueue=HASH(0x2dfe308)) called at /usr/local/cpanel/bin/cpbackup_transporter line 168
    eval {...} called at /usr/local/cpanel/bin/cpbackup_transporter line 166
 

cPanelLauren

Forums Analyst II
Staff member
Nov 14, 2017
10,088
877
313
Houston
Are both/all of the servers you're transporting from on the same network?


The specific error you're getting, "Upload attempt failed: Amazon::S3: Amazon responded with 403 Forbidden", indicates a permissions error. Are any transports to that destination working from this server?
 

Nate Reist

Both servers are on Media Temple's DV infrastructure. I'm not sure how their networking is configured, but I would guess they are on the same network.



Yes, a lot of the transports do work; in fact, even that same package transferred on attempt #2. The problem is that 403 isn't just a permissions error: S3 uses that status code for several possible errors, so if I can review the actual response, that might help solve this issue.
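On digging out the real cause behind a bare 403: S3 and S3-compatible services return an XML error body whose Code element names the specific failure (AccessDenied, RequestTimeTooSkewed, SignatureDoesNotMatch, and so on). The transporter only logs the HTTP status, but if the response body from a standalone test request can be captured, a tiny parser will surface the real code. A sketch, with an illustrative error body in the test:

```python
# Sketch: pull the machine-readable cause out of an S3-style XML error
# response. S3 and compatible services (such as Spaces) wrap errors in
# <Error><Code>...</Code><Message>...</Message></Error>.
import xml.etree.ElementTree as ET

def s3_error_code(body):
    """Return (Code, Message) from an S3 error response body."""
    root = ET.fromstring(body)
    return root.findtext("Code"), root.findtext("Message")
```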

Here is a bit more of that log. account1 (185 MB) took 1 hour 15 minutes to transfer, and account2 (1.9 GB) failed after 34 minutes, then successfully transferred on the second attempt in 2 minutes:

Code:
[2020-03-25 06:05:40 -0700] info [cpbackup_transporter] Instantiating Object
[2020-03-25 06:05:40 -0700] info [cpbackup_transporter] Starting a "copy" operation on the "Spaces backup storage" destination ID "IDOMITTED".
[2020-03-25 06:05:40 -0700] info [cpbackup_transporter] Validating destination path /dv02backups/2020-03-25/accounts/
[2020-03-25 06:05:41 -0700] info [cpbackup_transporter] Path exists
[2020-03-25 06:05:41 -0700] info [cpbackup_transporter] Uploading account backup /backup/2020-03-25/accounts/account1.tar.gz to /dv02backups/2020-03-25/accounts/account1.tar.gz (from 2020-03-25/accounts/account1.tar.gz) 
[2020-03-25 06:05:41 -0700] info [cpbackup_transporter] Attempting to upload /backup/2020-03-25/accounts/account1.tar.gz to /dv02backups/2020-03-25/accounts/account1.tar.gz for destination:  Spaces backup storage
[2020-03-25 06:05:41 -0700] info [cpbackup_transporter] Upload attempt #1 starting for /backup/2020-03-25/accounts/account1.tar.gz to /dv02backups/2020-03-25/accounts/account1.tar.gz for destination:  Spaces backup storage
[2020-03-25 07:20:57 -0700] info [cpbackup_transporter] Successful transfer of /backup/2020-03-25/accounts/account1.tar.gz to /dv02backups/2020-03-25/accounts/account1.tar.gz for destination Spaces backup storage
[2020-03-25 07:20:57 -0700] info [cpbackup_transporter] The backup has been successfully uploaded at least once, now we will delete the local copy (/backup/2020-03-25/accounts/account1.tar.gz) since keeplocal (0) is disabled.
[2020-03-25 07:20:57 -0700] info [cpbackup_transporter] cpbackup_transporter - Processing next task
[2020-03-25 07:20:57 -0700] info [cpbackup_transporter] Instantiating Object
[2020-03-25 07:20:57 -0700] info [cpbackup_transporter] Starting a "copy" operation on the "Spaces backup storage" destination ID "IDOMITTED".
[2020-03-25 07:20:57 -0700] info [cpbackup_transporter] Validating destination path /dv02backups/2020-03-25/accounts/
[2020-03-25 07:20:57 -0700] info [cpbackup_transporter] Path exists
[2020-03-25 07:20:57 -0700] info [cpbackup_transporter] Uploading account backup /backup/2020-03-25/accounts/account2.tar.gz to /dv02backups/2020-03-25/accounts/account2.tar.gz (from 2020-03-25/accounts/account2.tar.gz) 
[2020-03-25 07:20:57 -0700] info [cpbackup_transporter] Attempting to upload /backup/2020-03-25/accounts/account2.tar.gz to /dv02backups/2020-03-25/accounts/account2.tar.gz for destination:  Spaces backup storage
[2020-03-25 07:20:57 -0700] info [cpbackup_transporter] Upload attempt #1 starting for /backup/2020-03-25/accounts/account2.tar.gz to /dv02backups/2020-03-25/accounts/account2.tar.gz for destination:  Spaces backup storage
[2020-03-25 07:54:09 -0700] warn [cpbackup_transporter] Upload attempt failed: Amazon::S3: Amazon responded with 403 Forbidden
 at /usr/local/cpanel/3rdparty/perl/530/lib/perl5/cpanel_lib/Amazon/S3/Bucket.pm line 136.
 at /usr/local/cpanel/Cpanel/LoggerAdapter.pm line 27.
    Cpanel::LoggerAdapter::warn(Cpanel::LoggerAdapter=HASH(0x1f6c810), "Upload attempt failed: Amazon::S3: Amazon responded with 403 "...) called at /usr/local/cpanel/Cpanel/Backup/Queue.pm line 658
    Cpanel::Backup::Queue::transport_backup::attempt_to_upload_file(Cpanel::Backup::Queue::transport_backup=HASH(0x1d6e1b0), Cpanel::Transport::Files::S3Compatible=HASH(0x3e10db0), "/backup/2020-03-25/accounts/account2.tar.gz", "/dv02backups/2020-03-25/accounts/account2.tar.gz", Cpanel::LoggerAdapter=HASH(0x1f6c810)) called at /usr/local/cpanel/Cpanel/Backup/Queue.pm line 333
    Cpanel::Backup::Queue::transport_backup::process_task(Cpanel::Backup::Queue::transport_backup=HASH(0x1d6e1b0), Cpanel::TaskQueue::Task=HASH(0x3db11a0), Cpanel::LoggerAdapter=HASH(0x1f6c810)) called at /usr/local/cpanel/Cpanel/TaskQueue.pm line 635
    eval {...} called at /usr/local/cpanel/Cpanel/TaskQueue.pm line 638
    Cpanel::TaskQueue::__ANON__() called at /usr/local/cpanel/Cpanel/StateFile.pm line 241
    eval {...} called at /usr/local/cpanel/Cpanel/StateFile.pm line 241
    Cpanel::StateFile::Guard::call_unlocked(Cpanel::StateFile::Guard=HASH(0x3d81728), CODE(0x3d8a7f0)) called at /usr/local/cpanel/Cpanel/TaskQueue.pm line 643
    Cpanel::TaskQueue::process_next_task(Cpanel::TaskQueue=HASH(0x2dfe308)) called at /usr/local/cpanel/bin/cpbackup_transporter line 168
    eval {...} called at /usr/local/cpanel/bin/cpbackup_transporter line 166
[2020-03-25 07:54:09 -0700] info [cpbackup_transporter] Upload attempt #2 starting for /backup/2020-03-25/accounts/account2.tar.gz to /dv02backups/2020-03-25/accounts/account2.tar.gz for destination:  Spaces backup storage
[2020-03-25 07:56:48 -0700] info [cpbackup_transporter] Successful transfer of /backup/2020-03-25/accounts/account2.tar.gz to /dv02backups/2020-03-25/accounts/account2.tar.gz for destination Spaces backup storage
[2020-03-25 07:56:48 -0700] info [cpbackup_transporter] The backup has been successfully uploaded at least once, now we will delete the local copy (/backup/2020-03-25/accounts/account2.tar.gz) since keeplocal (0) is disabled.
 

cPanelLauren

I see; in this example, the first upload attempt failed and the second was successful. Unfortunately, you can't increase the logging of the backup transporter; it echoes out the error code Amazon gives it, which is 403 Forbidden. The causes of this that would be relevant to this circumstance are:

AccessDenied: Access denied.
InvalidObjectState: The operation is not valid for the current state of the object.
RequestTimeTooSkewed: The difference between the request time and the server's time is too large.
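The last cause is easy to rule out locally: RequestTimeTooSkewed fires when the request time and the server's time differ by more than the allowed skew (15 minutes for AWS S3). A quick sketch that measures the gap between a server's HTTP Date header and the local clock; the header value in the test is illustrative:

```python
# Sketch: compute clock skew between an HTTP Date header (as returned
# by an S3 endpoint) and the local UTC clock. A result anywhere near
# 900 seconds would explain RequestTimeTooSkewed on AWS S3.
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def skew_seconds(server_date_header, now=None):
    """Absolute seconds between an HTTP Date header and local UTC time."""
    server = parsedate_to_datetime(server_date_header)
    now = now or datetime.now(timezone.utc)
    return abs((now - server).total_seconds())
```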

Based on what occurred I'd assume this was a result of a timeout here:
Code:
[2020-03-25 07:20:57 -0700] info [cpbackup_transporter] Upload attempt #1 starting for /backup/2020-03-25/accounts/account2.tar.gz to /dv02backups/2020-03-25/accounts/account2.tar.gz for destination:  Spaces backup storage
[2020-03-25 07:54:09 -0700] warn [cpbackup_transporter] Upload attempt failed: Amazon::S3: Amazon responded with 403 Forbidden
That's a 34-minute gap between the start of the upload attempt and its failure. What is set for the timeout at WHM >> Backup >> Backup Configuration under Maximum Destination Backup Timeout?
Also, I believe you can enable logging on the S3 bucket, per "Enabling Logging Using the Console" in the Amazon Simple Storage Service documentation.
 

Nate Reist

That's a 34-minute gap between the initialization of the backup transport and its completion. What is set for the timeout at WHM>>Backup>>Backup Configuration for Maximum Destination Backup Timeout?
That is set to 4500 on both servers, so 75 minutes I assume?
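(4500 seconds is indeed 75 minutes. For what it's worth, at the throughput observed for account1 above, 185 MB in roughly 75 minutes, or about 0.04 MB/s, archives of that size sit right at the timeout boundary. A back-of-the-envelope check, assuming a sustained transfer rate:)

```python
# Sketch: will a transfer of a given size finish before the configured
# Maximum Destination Backup Timeout at an assumed sustained rate?
def fits_timeout(size_mb, throughput_mb_per_s, timeout_s=4500):
    """True if size_mb would upload within timeout_s at the given rate."""
    return size_mb / throughput_mb_per_s <= timeout_s
```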

Also, you can enable logging on the S3 bucket I believe, per Enabling Logging Using the Console - Amazon Simple Storage Service
I'll enable logging on the backup server and see what I get back from there. Once or twice I have seen RequestTimeTooSkewed and request timeout messages, but they are sporadic as well.