Ripping a local copy of a website for offline browsing

Discussion in 'Web Design and Development' started by Craigy, May 25, 2008.

  1. macrumors 6502

    Craigy

    Joined:
    Jan 14, 2003
    Location:
    New Zealand
    #1
    Hi - does anyone know of any Mac software that will enable me to create a local copy of a website for offline browsing?

    Thanks
     
  2. macrumors 6502a

    vandozza

    Joined:
    Jun 14, 2006
    Location:
    Australia
    #2
  3. macrumors G4

    Eraserhead

    Joined:
    Nov 3, 2005
    Location:
    UK
    #3
    Safari, Command-S (save as a web archive) should work...
     
  4. macrumors 6502a

    vandozza

    Joined:
    Jun 14, 2006
    Location:
    Australia
    #4
    Doesn't this work for the current page only? Or can you download an entire site? (I'm a Firefox user!)

    I think the Safari option is just for single pages, whereas SiteSucker/WebDevil/etc. can download an entire site and re-link it for offline browsing.
     
  5. macrumors member

    Joined:
    Jan 14, 2008
    #5
    wget and cURL come to mind, although I'm not sure whether either of them can copy an entire site. Remember that (security vulnerabilities aside) you should never be able to download the code behind the site (like the PHP that runs before the page is served to you) or the raw contents of the database, so things like search boxes, most logins, etc. won't work even if you manually downloaded every single file. Thus you can't save Google and get the entire internet. ;)
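
    For a single page, something along these lines should work with curl (the URL and output filename here are just placeholders):

    curl -o page.html http://example.com/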
     
  6. macrumors member

    Luveno

    Joined:
    May 12, 2006
    Location:
    Nova Scotia, Canada
    #6
    wget would definitely give you what you want (if you just want HTML/CSS/images). I don't believe wget is available on a base Tiger or Leopard install, so you'll likely need to build it yourself, or use MacPorts/Fink to get a copy. Anyway, wget will fetch recursively if you tell it to. My only suggestion is that you use the --wait option to specify how long to wait between requests, so that you don't hammer the site.
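
    If you go the MacPorts route, getting it should just be a matter of (assuming MacPorts itself is already set up):

    sudo port install wget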

    Use it like this (from the terminal, obviously):

    wget --wait=5 -r -p http://url.tld
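
    If you also want the links rewritten so the copy actually works offline, adding -k (convert links to point at the local files) and -E (save pages with an .html extension) should do it, something like:

    wget --wait=5 -r -p -k -E http://url.tld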
     
  7. macrumors 65816

    Joined:
    Sep 3, 2006
    #7
    I tried some applications, but each one has its own problems.
    Is there a good application that does the job well?
     
