
0098386 (Original poster)
I'm going on holiday in a couple of months and will have no internet connection for two weeks. It's all good except that I need access to a wiki. Not a giant Wikipedia-like monster, but is it possible to download a smallish one (probably a few hundred pages) as a static, offline copy?
I did a little googling around but I couldn't find anything.

I ask because a few years ago a guy at college did something similar with a bunch of Wikipedia articles, so I imagine the apps (or whatever) that do this have only got better.
 
I've never done it myself, but a coworker has used the "wget" command on Linux to accomplish this. Obviously, it works better with sites that are mostly static and whose links don't lead to dynamically generated content. Still, I wouldn't be surprised if there are tools that can handle dynamic content.

I'm not sure what the Mac equivalent is, but that command might work too.
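
Assuming you can get wget onto the Mac (it can be installed or built from source), something along these lines should mirror a small site for offline reading. The URL is just a placeholder, and you'd want to tune the options for the wiki in question:

wget --mirror --convert-links --page-requisites --adjust-extension --wait=1 --no-parent https://wiki.example.com/

The --convert-links part is the important bit: it rewrites links in the downloaded pages so they point at your local copies instead of back at the live site, and --page-requisites pulls in the images and stylesheets so the pages still look right offline.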

Of course, Safari has a "Save As" command under the File menu that will save a page as a web archive, so you could do it manually if you know exactly which pages you want. The problem with that approach is that each saved page is stand-alone; links from one saved page to another won't work offline.
 
Thanks all! Blue Crab Lite sounds like it will do the job just fine.
 