URL Loading System to Request Contents of a Directory?

Discussion in 'Mac Programming' started by Spike099, May 6, 2009.

  1. Spike099 macrumors regular

    Joined:
    Feb 18, 2007
    Location:
    Canada
    #1
    So, a little history about why this has piqued my curiosity.

    I had a client who wanted to transfer all his GeoCities files over to a real provider. The client had thousands of files on his account. Now, GeoCities doesn't allow you to do bulk transfers, nor can you access the site via FTP on a basic account. So... basically the only way for a typical person to do this was to download each file individually! This wasn't going to happen.

    My solution was to write an app that took in a CSV list and downloaded each file for you.

    This worked; however, I had to copy and paste the list of files and format it properly.

    My question is: is there a way to request the contents of a directory via the URL Loading System or some other Cocoa or Foundation framework?
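
    A minimal sketch of that downloader, assuming the list has one URL per row in its first CSV column; the file names urls.csv and Downloads are placeholders, and URLSession stands in here for the URL Loading System classes of the day (NSURLConnection / NSURLDownload):

    Code:
        import Foundation

        // Assumed layout: a plain-text CSV with a full URL as the first field of each row.
        // "urls.csv" and "Downloads" are placeholder names.
        let destination = URL(fileURLWithPath: "Downloads", isDirectory: true)
        try? FileManager.default.createDirectory(at: destination, withIntermediateDirectories: true)

        guard let csv = try? String(contentsOfFile: "urls.csv", encoding: .utf8) else {
            fatalError("Could not read urls.csv")
        }

        // Pull the first comma-separated field out of every non-empty line.
        let urls = csv.components(separatedBy: .newlines).compactMap { line -> URL? in
            let firstField = line.split(separator: ",").first.map(String.init) ?? ""
            return URL(string: firstField.trimmingCharacters(in: .whitespaces))
        }

        let group = DispatchGroup()
        for url in urls {
            group.enter()
            URLSession.shared.downloadTask(with: url) { tempFile, _, error in
                defer { group.leave() }
                guard let tempFile = tempFile, error == nil else {
                    print("Failed:", url, error?.localizedDescription ?? "unknown error")
                    return
                }
                // Note: this flattens any directory structure into a single folder.
                let target = destination.appendingPathComponent(url.lastPathComponent)
                try? FileManager.default.moveItem(at: tempFile, to: target)
                print("Saved", target.path)
            }.resume()
        }
        group.wait()   // block until every completion handler has run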
     
  2. kainjow Moderator emeritus

    Joined:
    Jun 15, 2000
    #2
    If you're referring to a directory on the server, no, not unless you go through some protocol like FTP or WebDAV.
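
    For reference, a minimal sketch of listing a directory over WebDAV with the URL Loading System, assuming the server actually speaks WebDAV; the host and path below are placeholders. A PROPFIND request with a Depth: 1 header returns an XML "multistatus" response naming the directory's immediate children:

    Code:
        import Foundation

        let directory = URL(string: "https://example.com/files/")!   // hypothetical WebDAV share

        var request = URLRequest(url: directory)
        request.httpMethod = "PROPFIND"
        request.setValue("1", forHTTPHeaderField: "Depth")   // list immediate children only
        request.setValue("application/xml", forHTTPHeaderField: "Content-Type")
        request.httpBody = """
        <?xml version="1.0" encoding="utf-8"?>
        <propfind xmlns="DAV:"><prop><displayname/></prop></propfind>
        """.data(using: .utf8)

        let done = DispatchSemaphore(value: 0)
        URLSession.shared.dataTask(with: request) { data, _, error in
            defer { done.signal() }
            guard let data = data, error == nil else {
                print("Request failed:", error?.localizedDescription ?? "unknown error")
                return
            }
            // A real client would feed this multistatus XML to XMLParser;
            // printing it is enough to show the listing coming back.
            print(String(data: data, encoding: .utf8) ?? "")
        }.resume()
        done.wait()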
     
  3. Spike099 thread starter macrumors regular

    Joined:
    Feb 18, 2007
    Location:
    Canada
    #3
    Yes, I was referring to a directory on the server.

    So a crawler would start by parsing whatever index page is provided (assuming robots.txt doesn't forbid it) and search the file for links. Is this how they find all the pages available on a web server?
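
    A minimal sketch of that idea, with a placeholder start URL; NSDataDetector stands in for a real HTML parser here, so it only picks up absolute links on the same host, and the robots.txt check a polite crawler would do first is left out:

    Code:
        import Foundation

        let start = URL(string: "https://example.com/index.html")!   // hypothetical site
        var pending = [start]
        var visited = Set<URL>()
        let detector = try! NSDataDetector(types: NSTextCheckingResult.CheckingType.link.rawValue)

        while let page = pending.popLast() {
            // Skip anything already crawled.
            guard visited.insert(page).inserted else { continue }
            // Synchronous fetch keeps the sketch short; relative links are not resolved.
            guard let html = try? String(contentsOf: page, encoding: .utf8) else { continue }

            let wholePage = NSRange(html.startIndex..., in: html)
            for match in detector.matches(in: html, options: [], range: wholePage) {
                guard let link = match.url, link.host == start.host, !visited.contains(link) else { continue }
                pending.append(link)
            }
            print("Crawled \(page); \(pending.count) link(s) still queued")
        }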
     
  4. xyzeugene macrumors newbie

    Joined:
    Feb 13, 2009
    #4
    Don't reinvent the wheel...

    Use http://www.httrack.com

    It's free, and you can slow it down if need be.

     
  5. Consultant macrumors G5

    Joined:
    Jun 27, 2007
    #5
    Not sure about that one, but there are plenty of apps that will download all linked files from a server.
     
