S3 and large backup files fail to PUT

briansol

Well-Known Member
Oct 31, 2007
46
2
58
ct
I was told this would work...
Amazon S3 integrated into WHM as a backup option | cPanel Feature Requests

it doesn't.
/backup/cpbackup/daily/MYSITE.tar.gz

Warning: S3::putBucket(blahblahblah, private, ): [BucketAlreadyOwnedByYou] Your previous request to create the named bucket succeeded and you already own it. in /root/userishere/S3.php on line 222

Warning: S3::putObject(): [EntityTooLarge] Your proposed upload exceeds the maximum allowed size in /root/userishere/S3.php on line 222
The file is about 40 GB.


How do I enable support for large files?


My other small sites all went over fine.
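For anyone hitting the same error, a quick back-of-envelope (my own arithmetic, not cPanel's code): S3 rejects any single PutObject larger than 5 GiB with exactly this EntityTooLarge error, so a 40 GB archive can only go up via multipart upload. The 100 MiB part size below is an arbitrary but common choice:

```python
# Rough arithmetic only -- not cPanel's backup code. S3 rejects any single
# PutObject larger than 5 GiB with EntityTooLarge; bigger objects must use
# multipart upload (parts of 5 MiB..5 GiB, at most 10,000 parts).

SINGLE_PUT_LIMIT = 5 * 1024**3   # 5 GiB single-PUT ceiling
PART_SIZE = 100 * 1024**2        # 100 MiB parts (an assumed, typical choice)
MAX_PARTS = 10_000               # S3's multipart part-count limit

def needs_multipart(size: int) -> bool:
    """True if a plain PutObject would fail with EntityTooLarge."""
    return size > SINGLE_PUT_LIMIT

def part_count(size: int, part_size: int = PART_SIZE) -> int:
    """Number of multipart parts needed (ceiling division)."""
    return -(-size // part_size)

archive = 40 * 1024**3  # the ~40 GB tarball from the log above
print(needs_multipart(archive))   # True -> hence the EntityTooLarge warning
print(part_count(archive))        # 410 parts of 100 MiB, well under the cap
```

So whatever script does the upload has to use the multipart API for files this size; a plain PUT will always fail.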
 

cPanelMichael

Administrator
Staff member
Apr 11, 2011
47,880
2,268
463
Hello :)

Could you elaborate on the contents of the "/root/whmcpb" directory that you referenced? Is this a third-party backup utility?

Thank you.
 

briansol

Well-Known Member
Oct 31, 2007
46
2
58
ct
Hrm, I had assumed that was part of the new backup system.

I HAD a third-party app for the legacy backup (which failed going to S3 for this exact size reason), but I don't know why it is firing at all, considering I've disabled the legacy backup system, enabled the new backup system, and of course adjusted the user-level account backup option panel.

I do see file system objects in the correct NEW folder structure and on S3... so the original question is resolved.

Which raises the question: why did that log ever run or fire at all? It looks like no new legacy backups were made since the 13th, which implies the 14th (the date of the original post) was the log of the first new backup with S3.

interesting....


I got no email last night. Looking at my settings, I didn't have it set to back up on Sunday for some reason.

I'll follow up tomorrow with what comes tonight.
 

Attachments

briansol

Well-Known Member
Oct 31, 2007
46
2
58
ct
So, I got the same log again last night.
/backup/cpbackup/daily/SITE.tar.gz

Warning: S3::putBucket(BUCKET, private, ): [BucketAlreadyOwnedByYou] Your previous request to create the named bucket succeeded and you already own it. in /root/whmcpb/S3.php on line 222

Warning: S3::putObject(): [EntityTooLarge] Your proposed upload exceeds the maximum allowed size in /root/whmcpb/S3.php on line 222

So this still raises the question of why it's firing the legacy app add-ons. Yes, this script appears to be the add-on that I installed a while back: cPanel backup solution, automate backup to remote ftp server and Amazon S3 cloud service

Here's my set up:

Legacy backups are off, and I disabled the checkbox as well for the post-cPanel backup include (see the install.sh shot).
 

Attachments

briansol

Well-Known Member
Oct 31, 2007
46
2
58
ct
Quote:
Howdy,

This may be related to internal case 93845 (S3 backups fail if they take longer than 15 minutes).

Can you please submit a ticket so we can investigate this issue?

Thanks,

Missed this.

It looks like for the first three days, the large file is there; last night, it is not.

The log indicates it is using the new format:
Code:
pkgacct using '/usr/bin/gzip -6' to compress archives
pkgacct working dir : /backup/2014-03-18/accounts/ACCOUNTNAME
Moving 40 GB must take some time, though.

I'm not sure that it is there YET, but it might be on its way over (or the pieces are still being assembled on the S3 side).
Does the log wait for the put to S3 to succeed, or does it only care that the send started?
 

cPanelMichael

Administrator
Staff member
Apr 11, 2011
47,880
2,268
463
To update, it was determined that this issue is related to internal case 93845. Per this case, when using the Amazon S3 destination in the new backup system, if the size of the archive is around 2.9 GB or over, the transport starts but times out after 15 minutes. This case is still under investigation, and there is currently no exact time frame available on if/when a resolution might be implemented.
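For a sense of scale (back-of-envelope with assumed decimal units, not an official figure): a fixed 15-minute window puts a floor on the sustained throughput needed, which is why archives around this size start failing.

```python
# Back-of-envelope only: the minimum sustained throughput needed to move a
# ~2.9 GB archive inside a 15-minute window. Decimal (SI) units assumed.

TIMEOUT_SECONDS = 15 * 60          # the reported transport timeout
ARCHIVE_BYTES = 2.9 * 1000**3      # ~2.9 GB archive size

min_mb_per_s = ARCHIVE_BYTES / TIMEOUT_SECONDS / 1000**2
print(f"{min_mb_per_s:.1f} MB/s")  # ~3.2 MB/s; slower uplinks hit the timeout
```

Any server whose effective path to S3 sustains less than roughly 3.2 MB/s cannot finish in time, regardless of how the upload is chunked.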

Thank you.
 

briansol

Well-Known Member
Oct 31, 2007
46
2
58
ct
To REALLY follow up.... lol

Turns out there are a couple of things going on:

1 - I have a legacy third-party app hitting a hook; I need to delete those files.
2 - The S3 transfer DOES in fact go, but it takes about 18 hours to fully arrive, so while I get the email that my backup is done, I don't necessarily see it on S3 yet... which is what sparked the issue.
3 - There's some internal movement to break up some local vs. S3 rules for movement, as well as separate logs. Hopefully we'll see something in a future build.

All in all, this DOES work for large files (mine is 40 GB). It just takes a day to get there... Not cPanel's fault; that's uplink speed (well, actually, it's S3 being slow that's the bottleneck).
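For the curious, those two numbers imply a fairly low effective throughput (rough math with assumed decimal units, an illustration rather than a measurement):

```python
# Effective throughput implied by the figures above: ~40 GB in ~18 hours.
# Decimal (SI) units assumed throughout.

size_bytes = 40 * 1000**3     # ~40 GB archive
duration_s = 18 * 3600        # ~18 hours end to end

mb_per_s = size_bytes / duration_s / 1000**2
print(f"{mb_per_s:.2f} MB/s")  # ~0.62 MB/s effective
```

At well under 1 MB/s end to end, it is entirely plausible for a backup to report success long before the file is visible on S3.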
 

cPanelPeter

Senior Technical Analyst
Staff member
Sep 23, 2013
586
25
153
cPanel Access Level
Root Administrator
Hello,

It should be noted that the backup is a separate process from the transfer to S3 (or any remote location). The backup is done first, and you will be alerted when it completes. The transfer starts after each successful backup, so you may get an alert that the backup is completed while the transfer is still in progress.
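The two-stage flow described above can be sketched as a toy model (the names and structure here are illustrative, NOT cPanel internals): the completion alert fires when the backup finishes, and the remote transfer is queued to run afterwards.

```python
# Toy model of the flow described above -- not cPanel internals. The backup
# completes (and would alert) first; the S3 transfer is queued and finishes later.

import queue
import threading

events = []                  # ordered record of what happened
transfers = queue.Queue()

def run_backup(account: str) -> None:
    events.append(f"backup done: {account}")  # the alert email would fire here
    transfers.put(account)                    # only now does the transfer start

def transfer_worker() -> None:
    # Drain the queue until a None sentinel arrives.
    while (account := transfers.get()) is not None:
        events.append(f"s3 transfer done: {account}")

worker = threading.Thread(target=transfer_worker)
worker.start()
run_backup("MYSITE")
transfers.put(None)          # shut the worker down
worker.join()
print(events)  # the backup alert always precedes the transfer completing
```

This is why checking S3 right after the "backup completed" email can show nothing yet: the alert marks the end of stage one, not the end of the transfer.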