Terminal, FTP, and Multiple Directories

Discussion in 'macOS' started by dpaanlka, Jun 4, 2007.

  1. dpaanlka macrumors 601

    dpaanlka

    Joined:
    Nov 16, 2004
    Location:
    Illinois
    #1
    Ok, I'm trying to get multiple files and directories via the command line. Basically, I want to get an entire directory that contains more directories and files.

    But when I type the get command, it gives me a bunch of errors saying the directories I'm trying to get don't exist on my local machine. How can I work around this? There are a lot of directories, and I don't want to manually create each one and get its contents.
     
  2. plinden macrumors 68040

    plinden

    Joined:
    Apr 8, 2004
    #2
    Does recursive get work on the FTP server?
    Code:
    get -R xxx
    Edit: I should have said "does recursive get work with your client?" - recursive get is client-specific
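    Something like this, if your client supports it (just a sketch to try; ftp.example.com and somedir are placeholders):
    Code:
    ftp ftp.example.com
    ftp> get -R somedir
    ftp> bye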
     
  3. dpaanlka thread starter macrumors 601

    dpaanlka

    Joined:
    Nov 16, 2004
    Location:
    Illinois
  4. plinden macrumors 68040

    plinden

    Joined:
    Apr 8, 2004
    #4
    Ok, your client likely doesn't support get -r ... how about mget?
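    Something like this (a sketch only; prompt just turns off the per-file confirmation, and note that * normally matches files in the current remote directory, not subdirectories):
    Code:
    ftp> prompt
    ftp> mget *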
     
  5. dpaanlka thread starter macrumors 601

    dpaanlka

    Joined:
    Nov 16, 2004
    Location:
    Illinois
    #5
    I tried mget and mget -R too, with the same result. But I can download each file individually just fine?

    Argh!
     
  6. plinden macrumors 68040

    plinden

    Joined:
    Apr 8, 2004
    #6
    Are you using a Mac? (I don't want to make any assumptions.) Try using the Finder (open a Finder window, press Command-K, and type "ftp://yourftpserver").

    You should be able to drag directories over.

    If you're using Linux, try wget -r ftp://...

    Edit: actually, on rereading your OP, I guess you are using a Mac, and using the Mac FTP client. I don't think I've ever actually used OS X's ftp command. But the Finder approach should work.
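    Something along these lines (a sketch, assuming wget is installed; user, password, and the paths are placeholders):
    Code:
    wget -r ftp://user:password@yourftpserver/path/to/dir/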
     
  7. dpaanlka thread starter macrumors 601

    dpaanlka

    Joined:
    Nov 16, 2004
    Location:
    Illinois
    #7
    Actually, I'm logged into a remote UNIX server via SSH and the Terminal, and I'm trying to get the entire contents of a directory from a second remote server, via FTP, onto the first remote server that I'm SSH'd into.

    I don't want to waste my bandwidth downloading 7 GB of files and then re-uploading them over my slow cable internet, when I can just have the two servers talk to each other all fast-like.

    I am on a Mac though.
     
  8. SC68Cal macrumors 68000

    Joined:
    Feb 23, 2006
    #8
    Your best bet is to SSH in and then ftp to the second server. I don't think CLI ftp has the ability to download directories. I've only had luck using mget with a wildcard (*) to download just the contents, making sure I created the directories on my local machine first.

    Better yet, tar up your directories on the remote machine before FTP'ing them to the host you're connected to through SSH. That'll preserve the directory structure and such, as in the sketch below.
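    Roughly like this (a sketch, assuming you also have shell access on the machine the files live on; archive.tar.gz and stuff/ are placeholder names):
    Code:
    tar czf archive.tar.gz stuff/
    Then FTP the single archive.tar.gz across and unpack it on the other end:
    Code:
    tar xzf archive.tar.gz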
     
  9. Am3822 macrumors 6502

    Am3822

    Joined:
    Aug 16, 2006
    Location:
    Groningen, The Netherlands
    #9
    Can't you use scp to copy the files from server2 to server1?

    [in server1]
    Code:
    scp -r user@server2:path_server2/* path_server1/

    I think it should work.
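    (Assuming OpenSSH on both machines, this needs an SSH login on server2, not just FTP access. Also, since the * is expanded on server2, dotfiles can get skipped; copying the directory itself, scp -r user@server2:path_server2 path_server1/, avoids that.)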
     
  10. dpaanlka thread starter macrumors 601

    dpaanlka

    Joined:
    Nov 16, 2004
    Location:
    Illinois
    #10
    I sent a description to Dreamhost to see if they had any ideas. Amazingly, at 3:45 in the morning they still have people answering support emails. They suggested I try a wget --mirror command, which I did, and that seems to be taking care of it.
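    For the record, it was along these lines (placeholder login details; --mirror is basically recursive retrieval with timestamping turned on):
    Code:
    wget --mirror ftp://user:password@otherserver/path/to/dir/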

    I still wonder why FTP wouldn't work.

    And now, for a moment of zen...

     
  11. SC68Cal macrumors 68000

    Joined:
    Feb 23, 2006
    #11
    I was thinking about wget, but I didn't know whether the files you were looking for were publicly accessible or in a private directory.
     
