
speedemonV12

macrumors 6502
Original poster
Nov 29, 2005
Hello, I am looking for the best program to download an entire website to my HD. Any suggestions?

I am also looking for a program that can go through specified pages of a website and download the images from those pages. Any apps for this?

Thanks !
 
The reason for downloading a whole website is that it's a forum. I'm going on a trip and don't know if I'll have internet, so I wanted to download the whole site to my computer.

As for the image part: the threads I'm browsing have many images that people post across the pages, and I want to download all of them. I'm trying to put together all the icons that go in certain iPhone themes, and they're posted all over.
For example:
http://macthemes2.net/forum/viewtopic.php?id=16783459
That thread has 52 pages with images on all of them, and I want those images. If I can't find a way to get the images easily, I wanted to download the whole website, so that when I'm bored I can go through and save them all manually.
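
For reference, the page-by-page scrape is simple enough to script yourself. Here's a minimal PHP sketch; the &p= page parameter and the image regex are guesses about how that forum builds its URLs and markup, so adjust to taste:

PHP:
// Walk each page of the thread and save every linked image.
$base  = 'http://macthemes2.net/forum/viewtopic.php?id=16783459';
$pages = 52;

for ($p = 1; $p <= $pages; $p++) {
    $html = @file_get_contents($base . '&p=' . $p); // needs allow_url_fopen
    if ($html === false) {
        continue;
    }
    // Grab anything that looks like an image URL
    if (preg_match_all('/https?:\/\/[^"\'\s]+\.(?:png|jpe?g|gif)/i', $html, $m)) {
        foreach (array_unique($m[0]) as $img) {
            $data = @file_get_contents($img);
            if ($data !== false) {
                file_put_contents(basename($img), $data);
            }
        }
    }
}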

OK... so it seems that using Firefox and the DownThemAll extension, I can save all the images from a given page. But I would still like to download the whole website, since I will be without internet for a while. Any suggestions?


Also, NAG, how would I create an Automator script to do that, just out of curiosity?
 
OK, so I have SiteSucker and it's working, but I noticed that it's saving all the sites the main site links to. Is there any way I can stop that? I just want the main site and all the images that are on that site.
 
I'm not sure how to work it, but in SiteSucker's preferences go to the "Limit" tab,

and it will be either the
"limit to level" setting or the
"limit to directory" setting. Hope it helps.
 
I don't get that option in the Limit tab; here's what mine looks like:



The site that I'm trying to download is

http://macthemes2.net/forum/

and I'm noticing that all the forum topics are .php pages, so when I open them from the SiteSucker folder, I just get the code displayed in the browser. Anything I can do about that so I can view them like I normally would?
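
Worth knowing: what SiteSucker saves is normally the HTML the server already rendered (the PHP itself only ever runs server-side), just stored under a .php filename, so renaming the saved files to .html is usually enough to make a browser display them normally. A quick sketch of that, where the download folder path is just an example:

PHP:
// Rename downloaded .php pages to .html so the browser treats them
// as ordinary pages. $dir is wherever SiteSucker saved the site.
$dir = '/Users/you/Downloads/macthemes2.net/forum';

foreach (glob($dir . '/*.php*') as $file) {
    rename($file, $file . '.html'); // viewtopic.php?id=123 -> ...id=123.html
}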
 
DownThemAll is an add-on for Firefox that can download all the images (or videos, or music or whatever) from a webpage. You'd probably have to go through it page by page though.
 
Yeah, I found that extension and it works wonderfully, but you're right, it would have to be page by page, which I don't mind doing. But I need the pages while I'm offline, lol!
 
PHP pages

Hi all,

The website I want to download is made up of PHP pages. SiteSucker can't handle that, and I'm wondering if anyone knows of software that can?

Thanks

T
 
First off, this is completely possible using PHP itself.
Pretty sure it still bypasses the lame http://www.cyberciti.biz/tips/php-script-downloaded-as-source-code.html issue.

PHP:
// Take the target URL from the request, e.g.
// ?url=http://www.weaktargetsite.com/targetScript.php
$url = isset($_GET['url']) ? $_GET['url'] : null;

if ($url !== null) {
    // copy() with a remote source fetches the URL over HTTP
    // (this needs allow_url_fopen enabled) and saves the response locally
    if (!copy($url, "nowMine.php")) {
        echo "failed to copy file";
    }
}

The logic here is nice, that's all I can say.
Let's take a look:
1. Create a new file named cross-domain-download.php in the place where you want downloads to end up on your PHP-enabled web server (or other ...).
2. Get the URL from a form or something and check that it's set; if so,
3. if the file is not copied (which it should not be), display an error message.
4. But the file DOES EXIST, so it is instead copied to nowMine.php in the same location as the cross-domain-download.php script you just created.
Why does this work, you ask?
Well, we are not asking the remote server to copy it; we're telling our own system to copy whatever $url points at, and if it can't, to just display the failure message.
This solution may never set off whatever restrictions admins put in their server config to prevent this sort of activity (the PHP setting that actually governs it is allow_url_fopen; with that disabled in php.ini, copy() can't open remote URLs at all). However, I am assuming this is easy to work around with similar settings.
Just try it; if it doesn't work then I guess I am just one LUCKY mother, because every time I needed it, it seems to have worked wonders. My targets were not huge franchises either, though. ;):apple:
The only reason I had this little script was because I needed to add it to a CMS application I was developing for a firm I used to work at. Hope this does at least what you need for now. Sorry if not, long post.:rolleyes:
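
For what it's worth, with the $_GET version above you'd invoke it as cross-domain-download.php?url=http://www.weaktargetsite.com/targetScript.php — just keep in mind that whatever comes back from a .php URL is the remote server's response, since the PHP executes on its end.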

-712011m4n
 
Since you specifically want the PHP itself to be saved as well, this may be a huge issue for you. Like everyone is stating, PHP is parsed server-side, so a server won't hand over raw external PHP. Still, I believe there is always a solution; it's just a matter of application (and the proper programming language ;D).

Do stuff like this and keep trying, despite the known facts:
PHP:
// $filename was never defined in the original snippet; assume it
// comes in from the request (or hardcode a path to test with)
$filename = isset($_GET['file']) ? $_GET['file'] : 'index.php';

// Report whether the file exists and is readable by the web server
if (file_exists($filename)) {
    echo "The file $filename exists";
} else {
    echo "The file $filename does not exist";
}
echo "<br/>";
if (is_readable($filename)) {
    echo 'The file is readable';
} else {
    echo 'The file is not readable';
}

// Read the file back line by line and echo its contents
$handle = @fopen($filename, "r");
if ($handle) {
    while (($buffer = fgets($handle, 4096)) !== false) {
        echo $buffer;
    }
    if (!feof($handle)) {
        echo "Error: unexpected fgets() fail\n";
    }
    fclose($handle);
}
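
If you save that on a server as, say, probe.php (the name is just an example), you'd call it as probe.php?file=something.php. Since it runs server-side, though, it can only read files that the web server it lives on can reach, which is exactly the "known facts" limitation above.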
 