The Community Forums

Interact with an entire community of cPanel & WHM users!

FTP - Can't see file, but it is there?

Discussion in 'General Discussion' started by noimad1, Aug 18, 2005.

  1. noimad1

    noimad1 Well-Known Member

    Joined:
    Mar 27, 2003
    Messages:
    627
    Likes Received:
    0
    Trophy Points:
    16
    I have a user with a 2 GB file, and he can't see it through FTP (neither can I). But if I SSH to the server, the file is definitely there.

    I have made sure the file is owned by him, and even gave it 755 permissions, but we cannot see the file at all.

    Has anyone run across this? I don't see why a size limit would keep the file from being listed, but could that be it?
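    Before touching the FTP daemon, it's worth confirming from the SSH session that the file really does cross the 2 GB mark. A minimal check, with a hypothetical path standing in for the user's real one:

        # Hypothetical path; substitute the account's actual home directory.
        ls -lh /home/username/public_html/backup.tar.gz

        # Print the exact byte count and owner; anything at or above
        # 2147483648 bytes (2 GiB) will trip an FTP daemon built without
        # large-file support.
        stat -c '%s bytes, owner %U, %n' /home/username/public_html/backup.tar.gz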
     
  2. xerophyte

    xerophyte Well-Known Member

    Joined:
    Mar 16, 2003
    Messages:
    216
    Likes Received:
    0
    Trophy Points:
    16
    Location:
    Canada
    From the Pure-FTPd build documentation:

    --with-largefile: support downloading of files larger than 2 gigabytes on
    32-bit architectures. Transferring such huge files through FTP is a strange
    idea. Your filesystem has to support it, and your kernel and your libc as
    well. And of course, the FTP client has to be safe with large files, too.
    When this feature is enabled, downloads can be a bit slower (or more
    CPU-intensive) than without it, due to a limitation of current Linux kernels.
    To summarize: don't enable this for fun, only if you are really planning to
    download files over 2 gigabytes.

    I would use rsync or scp to move big files like that (a quick sketch follows below).

    Hope that helps.
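    A minimal sketch of the rsync/scp approach suggested above; the hostname, user, and paths are placeholders:

        # Pull the large file over SSH instead of FTP; scp handles files
        # over 2 GB as long as the filesystems on both ends do.
        scp user@server.example.com:/home/username/backup.tar.gz /local/backups/

        # rsync can resume an interrupted copy and report progress, which
        # is useful for multi-gigabyte archives.
        rsync -av --partial --progress \
            user@server.example.com:/home/username/backup.tar.gz /local/backups/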
     
  3. xidica

    xidica Well-Known Member

    Joined:
    Apr 21, 2005
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    6
    Location:
    Texas
    Yeah, the only way to do this via FTP is going to be to recompile the FTP daemon with the configure option for large-file support (see the sketch below)...
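    A rough outline of that rebuild, assuming a stock Pure-FTPd source tarball rather than cPanel's own build scripts (the version number is only an example):

        # Unpack the Pure-FTPd source (version is illustrative).
        tar xzf pure-ftpd-1.0.20.tar.gz
        cd pure-ftpd-1.0.20

        # Enable support for files larger than 2 GB on 32-bit systems,
        # then build and install.
        ./configure --with-largefile
        make
        make install

    Note that a later cPanel update may rebuild the FTP daemon and overwrite a manual compile, so treat this as a stopgap.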
     
  4. ramprage

    ramprage Well-Known Member

    Joined:
    Jul 21, 2002
    Messages:
    667
    Likes Received:
    0
    Trophy Points:
    16
    Location:
    Canada
    I just ran into this issue myself. I have a 9 GB tar.gz backup of a massive site on the server, and it wasn't visible in FTP. I kept messing around with the pureftpd.conf options, but that didn't help. I could see smaller files with no problem.

    I guess what I need to do is get the large archive off Linux and onto my home Windows box. I was hoping FTP could do this, but I guess not without a custom recompile. How hard is it to recompile the FTP daemon --with-largefile support?
     
  5. ramprage

    ramprage Well-Known Member

    Joined:
    Jul 21, 2002
    Messages:
    667
    Likes Received:
    0
    Trophy Points:
    16
    Location:
    Canada
    Never mind, I found a better solution. I tried out an open-source app called WinSCP, which is awesome: I just drag and drop the file I want to my local desktop, and it sees the 9+ GB backup.

    Cheers
     
  6. webignition

    webignition Well-Known Member

    Joined:
    Jan 22, 2005
    Messages:
    1,880
    Likes Received:
    0
    Trophy Points:
    36
    Yep, I use WinSCP as well and find it fantastic.

    It certainly makes getting a file onto your computer straightforward if you know what you're looking for and can't be bothered to type the relevant SSH commands (the command-line equivalent is sketched below).

    I also find it a good way to edit files, like httpd.conf, and more comfortable to use than pico or nano.
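    For anyone who does want the command-line route from a Windows machine, PuTTY's pscp tool copies over the same SSH channel that WinSCP uses; the host, user, and paths below are placeholders:

        pscp user@server.example.com:/home/username/backup.tar.gz C:\backups\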
     