Old May 25, 2008, 07:33 AM   #1
Craigy
macrumors 6502
 
Join Date: Jan 2003
Location: New Zealand
Ripping a local copy of a website for offline browsing

Hi - does anyone know of any Mac software that will enable me to create a local copy of a website for offline browsing?

Thanks
Old May 25, 2008, 08:25 AM   #2
vandozza
macrumors 6502a
 
Join Date: Jun 2006
Location: Australia
http://www.sitesucker.us/home.html

Try this - I have used it in the past, and it should do what you require.
Old May 25, 2008, 09:51 AM   #3
Eraserhead
macrumors G4
 
Join Date: Nov 2005
Location: UK
In Safari, Command-S (saving as a web archive) should work...
__________________
Actually it does make sense. Man created god, so if we exist, He exists. - obeygiant
Old May 25, 2008, 09:21 PM   #4
vandozza
macrumors 6502a
 
Join Date: Jun 2006
Location: Australia
Quote:
Originally Posted by Eraserhead
In Safari, Command-S (saving as a web archive) should work...
Doesn't this only work for the current page? Or can you download an entire site? (I'm a Firefox user!)

I think the Safari option is just for single pages, whereas SiteSucker/WebDevil/etc. can download an entire site and re-link it for offline browsing.
Old Jun 2, 2008, 12:40 AM   #5
anubis26
macrumors member
 
Join Date: Jan 2008
wget and cURL come to mind, although I'm not sure whether either of them can copy an entire site. Remember that (security vulnerabilities aside) you should never be able to download the code behind the site (like the PHP that runs before the page is served to you) or the raw contents of the database, so things like search boxes, most logins, etc. won't work even if you manually downloaded every single file. Thus you can't save Google and get the entire internets.
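
As a rough single-page sketch (example.com and page.html are just placeholder names), curl can save one served page to disk:

curl -o page.html http://example.com/

But curl has no recursive mode of its own, so for mirroring a whole site wget is the better fit, as the next post shows.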
Old Jun 4, 2008, 08:57 AM   #6
Luveno
macrumors member
 
Join Date: May 2006
Location: Nova Scotia, Canada
wget would definitely give you what you want (if you just want HTML/CSS/images). I don't believe wget is available on a base Tiger or Leopard install, so you'll likely need to build it yourself, or use MacPorts/Fink to get a copy. Anyway, wget will fetch recursively if you tell it to. My only suggestion is that you use the --wait option to specify how long to pause between requests, so that you don't hammer any site.

Use it like this (from the terminal, obviously):

wget --wait=5 -r -p http://url.tld
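
If you also want the saved pages to link to each other locally instead of back to the live site, wget's --convert-links (-k) option should handle that, and --no-parent (-np) keeps the crawl from climbing above the starting directory. Something like this (url.tld is still just a placeholder):

wget --wait=5 -r -p -k -np http://url.tld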
__________________
I never contradict myself, except for when I do.
Old Apr 12, 2014, 08:46 AM   #7
cool11
macrumors 65816
 
Join Date: Sep 2006
I tried some applications, but each one has its own problems.
Is there a good application that does the job well?
