
Craigy

macrumors 6502
Original poster
Jan 14, 2003
New Zealand
Hi - does anyone know of any Mac software that will enable me to create a local copy of a web site for offline browsing?

Thanks
 
Safari, Command-S (save as a web archive) should work...

Doesn't this only work for the current page? Or can you download an entire site? (I'm a Firefox user!)

I think the Safari option is just for single pages, whereas sitesucker/webdevil/etc. can download an entire site and re-link it for offline browsing.
 
wget and cURL come to mind, although I'm not sure if either of them can copy an entire site. Remember that (security vulnerabilities aside) you should never be able to download the code behind the site (like the PHP that runs before the page is served to you) or the raw contents of the database, so things like search boxes, most logins, etc. won't work even if you manually downloaded every single file. Thus you can't save Google and get the entire internets ;)
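For what it's worth, cURL on its own only fetches the exact URLs you give it (no recursion), so it's better suited to grabbing individual files. A minimal sketch, with url.tld as a placeholder:

curl -O http://url.tld/index.html

The -O flag just saves the file under its remote name; for mirroring a whole site, wget is the better fit, as described below.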
 
wget would definitely give you what you want (if you just want HTML/CSS/images). I don't believe wget is available on a base Tiger or Leopard install, so you'll likely need to build it yourself, or use macports/fink to get a copy. Anyway, wget will fetch recursively if you tell it to. My only suggestion is that you use the --wait option to specify how long to wait between requests, so that you don't hammer any site.

Use it like this (from the terminal, obviously):

wget --wait=5 -r -p http://url.tld
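If you want the saved copy to browse cleanly offline, wget also has --convert-links and --no-parent (standard flags per its man page, though exact behaviour can vary by version). Something along these lines, again with url.tld as a placeholder:

wget --wait=5 --recursive --page-requisites --convert-links --no-parent http://url.tld

--convert-links rewrites the links in the downloaded pages to point at your local copies, and --no-parent stops it from wandering above the directory you started in.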
 
I tried a few applications, but each one has its own problems.
Is there a good application that does the job well?
 