
Backup to Amazon S3 Methods

Discussion in 'Data Protection' started by chrisando, Dec 25, 2012.

  1. chrisando (Registered)

    I have a VPS and need to do remote backups to Amazon S3.

    These are the options I see available to me:

    1. Use cPanel backup to create a local backup, then use s3cmd to transfer the /backup folder to an Amazon S3 bucket (sketched after this list).
    This requires roughly double the disk space on my server while the backup is created.

    2. Use cPanel backup to write backups directly to an Amazon S3 bucket that I have mounted with FUSE/s3fs.
    This gives me "Input/output error (5)" in the backup log.
    Has anyone successfully made this work? This is ideal, as it saves me disk space.

    3. Directly copy the root directory of the VPS to an S3 bucket.
    This wouldn't require extra disk space, but it would make recovery a little harder (though I understand it can be done fairly easily?).
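
    For reference, option 1 is typically scripted something like the sketch below. It is only a sketch: the bucket name and paths are placeholders, and it assumes s3cmd has already been set up with your AWS keys via "s3cmd --configure".

        #!/bin/sh
        # Sync the local cPanel backup directory to an S3 bucket after the
        # nightly backup run finishes. Bucket and paths are examples only.
        BACKUP_DIR=/backup/cpbackup/daily
        BUCKET=s3://my-backup-bucket/cpbackup/daily/

        # --delete-removed drops objects from the bucket that no longer
        # exist locally, so the bucket tracks the local retention policy
        s3cmd sync --delete-removed "$BACKUP_DIR/" "$BUCKET"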


    Ideally I want to get #2 to work and transfer the backup directly to S3 without needing extra disk space locally.
    Has anyone got this to work and can shed some light on how it is done?

    Thanks
     
  2. Travis (Active Member, Staff Member)

  3. chrisando (Registered)

    Have definitely 'liked' the feature.

    In the meantime, does anybody have any suggestions on how I can take quality backups without using extra disk space on the server?
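
    One way to avoid the intermediate local copy, as an untested sketch, is to stream the archive straight to S3. This assumes a tool that can read from stdin; newer versions of the AWS CLI accept "-" as the source for "aws s3 cp". The bucket name is a placeholder, and note this is a raw copy of /home rather than a full cPanel account backup, so restores would be manual.

        # Stream a tar of /home directly into S3; no local archive file is written.
        # Assumes the AWS CLI is installed and configured with credentials.
        tar -czf - /home | aws s3 cp - s3://my-backup-bucket/home-$(date +%F).tar.gz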
     
  4. Infopro (cPanel Sr. Product Evangelist, Staff Member)

    Add a second drive to the server. :)
     
  5. Astral God (Well-Known Member)

    Have you tried s3fuse in place of s3fs?

    Note that you'll need to compile FUSE from source (not install it from yum).
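
    Roughly, that build-from-source step looks like the following sketch; the FUSE version and download URL are examples, not exact instructions.

        # Build and install FUSE from source; per the post above, the
        # version packaged in yum is too old for current s3fs/s3fuse builds.
        wget http://downloads.sourceforge.net/fuse/fuse-2.9.2.tar.gz
        tar -xzf fuse-2.9.2.tar.gz
        cd fuse-2.9.2
        ./configure && make && make install

        # Then build s3fuse (or s3fs) against this FUSE per its README and
        # mount the bucket at the path cPanel writes backups to.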
     
  6. briansol (Active Member)

    I bought a script to do MOST of this from a third party:
    http://wiki.nicecoder.com/display/CPAN/Installation+for+WHM

    It works fine for small files (up to Amazon's single PUT limit of 5 GB), but it fails on the large archives from my big sites (15+ GB) because S3 won't take them in one request:
    http://aws.amazon.com/s3/faqs/#How_much_data_can_I_store

    Amazon documents a multipart upload API for this case:
    http://docs.aws.amazon.com/AmazonS3/latest/dev/uploadobjusingmpu.html


    Warning: S3::putObject(): [EntityTooLarge] Your proposed upload exceeds the maximum allowed size in /root/whmcpb/S3.php on line 222
    /backup/cpbackup/daily/xxxxxxx.tar.gz


    I have opened a dialogue with the developer of the script, but so far no results. Basically, Amazon's API can split an upload into parts and re-join them through their service (multipart upload), but it's far more complicated than just PUTting the file (see the sketch below).
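
    For what it's worth, that multipart flow can be driven from the shell with the AWS CLI's low-level s3api commands. A sketch only: bucket, key, and part files are placeholders, and each upload-part call returns an ETag that has to be collected for the final step.

        # 1) Start the multipart upload and capture the UploadId
        UPLOAD_ID=$(aws s3api create-multipart-upload --bucket my-bucket \
            --key backup.tar.gz --query UploadId --output text)

        # 2) Upload each chunk produced by split(1); part numbers start at 1
        aws s3api upload-part --bucket my-bucket --key backup.tar.gz \
            --part-number 1 --body backup.tar.gz.part_aa --upload-id "$UPLOAD_ID"

        # 3) Stitch the parts together; parts.json lists every part number
        #    with the ETag returned by its upload-part call, e.g.
        #    {"Parts": [{"PartNumber": 1, "ETag": "\"abc...\""}]}
        aws s3api complete-multipart-upload --bucket my-bucket --key backup.tar.gz \
            --upload-id "$UPLOAD_ID" --multipart-upload file://parts.json

    The higher-level "aws s3 cp" does this part bookkeeping automatically for large files, which is usually easier than scripting s3api by hand.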


    In the meantime, every now and then I manually re-tar things into 2.5 GB archives and manually s3cmd the part_X files up to S3 (sketched below).

    It's not the greatest, but it's the best option available right now.
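
    That manual split-and-upload routine can be scripted roughly as follows; the archive name, chunk size, and bucket are examples, and s3cmd must already be configured.

        # Cut the big archive into 2.5 GB chunks named .part_aa, .part_ab, ...
        split -b 2500M /backup/cpbackup/daily/xxxxxxx.tar.gz xxxxxxx.tar.gz.part_

        # Upload each chunk; on restore, "cat xxxxxxx.tar.gz.part_* > xxxxxxx.tar.gz"
        # reassembles the original archive before extracting it.
        for f in xxxxxxx.tar.gz.part_*; do
            s3cmd put "$f" s3://my-backup-bucket/daily/
        done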

    I'm hoping the RAID 1 holds up for me and that I can restore from a backup of /home if I lose a drive. But if not, at least I'll have MOST of my site on S3.

    In the meantime, does anyone know if there's a way to split the cPanel archive into chunks automatically? Is there a --split flag that can be added to the backup task somehow?
     
    #6 briansol, Jan 15, 2013
    Last edited: Jan 15, 2013
  7. Infopro (cPanel Sr. Product Evangelist, Staff Member)