
Redjericho

macrumors 6502a
Original poster
Sep 16, 2011
Is there an app that can save an entire website? There seem to be only apps that save single pages, which Safari can already do.
 
He/she wants to save the entire website. Every page of the site. AFAIK there isn't anything out there that will let you do this.

Well, by giving either app a rule that allows any recursion depth, both are capable of this - after all, wget (and the like) is the industry-standard tool for grabbing full sites for exactly this purpose.

Of course, if non-linked resources also need to be saved, the site should be accessed over FTP rather than HTTP (assuming he knows the password).
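
For what it's worth, a typical recursive wget run looks roughly like this (example.com, the user/password and the public_html path are just placeholders):

# mirror the whole site over HTTP, grabbing images/CSS and rewriting links for offline viewing
wget --mirror --page-requisites --convert-links --no-parent https://example.com/

# or pull everything, including non-linked files, over FTP
wget -r ftp://user:password@example.com/public_html/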
 
Well, by giving either app a rule that allows any recursion depth, both are capable of this - after all, wget (and the like) is the industry-standard tool for grabbing full sites for exactly this purpose.

Of course, if non-linked resources also need to be saved, the site should be accessed over FTP rather than HTTP (assuming he knows the password).

Oh! See...one learns something new every day. Does this also work with sites that are driven by database queries, etc., or is it more for static sites?
 
The Wayback Machine has an archived history of websites going back to 1996. I know it's not exactly what you are looking for.
 
Oh! See...one learns something new every day. Does this also work with sites that are driven by database queries, etc., or is it more for static sites?

Static sites, IIRC. Although it would technically be fairly easy to do a dynamic version too (with a pre-defined dictionary of queries). wget may already support something like it? Dunno.
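
Roughly what I mean by a dictionary of queries (the URLs and the queries.txt filename below are made up): build the query URLs yourself and feed the list to wget.

# queries.txt - one database-driven URL per line, e.g.
#   https://example.com/search.php?q=apple
#   https://example.com/search.php?q=banana

# fetch every listed URL plus the images/CSS each page needs
wget --input-file=queries.txt --page-requisites --convert-links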
 