wget and cURL come to mind, although I'm not sure whether either of them can copy an entire site on its own. Remember that (security vulnerabilities aside) you should never be able to download the code behind the site (like the PHP that runs before the page is served to you) or the raw contents of the database, so things like search boxes, most logins, etc. won't work even if you manually downloaded every single file. In other words, you can't just save Google and end up with the entire internet.
wget would definitely give you what you want (if you just want HTML/CSS/images). I don't believe wget ships with a base Tiger or Leopard install, so you'll likely need to build it yourself or grab a copy via MacPorts/Fink. Anyway, wget will fetch recursively if you tell it to. My only suggestion is to use the --wait option to specify how long to pause between requests, so that you don't hammer any site with traffic.
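Something along these lines should do it as a rough sketch (example.com is just a placeholder, and while these long-form flags exist in GNU wget, it's worth double-checking `wget --help` on whatever build you end up with):

    # If you went the MacPorts route, installing it was probably:
    #   sudo port install wget

    # Mirror the site: recurse through links, grab the page assets
    # (CSS, images), rewrite links so the copy browses locally, stay
    # below the starting directory, and pause 2 seconds between requests.
    wget --recursive --page-requisites --convert-links --no-parent \
         --wait=2 http://example.com/

The --wait=2 there is the polite part; bump it up if the site is small or you're in no hurry.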