
one1

There is a website that has a couple thousand cool old car photos I want to save and keep as rotating wallpaper. The URLs are conveniently numbered 1 through 2345, so all I need is an app or script that saves the pictures from url.test/1.jpg through url.test/2345.jpg.

Does such a harvester exist? Could I make an RSS aggregator do this somehow?

Thanks :)
 
I attempted to write a shell script for you
Code:
#!/bin/bash
#change the url to the correct site and directory
#make sure the file has execute permissions (set with chmod +x in Terminal)
#this will take as long as it needs to download all the files
#the files will end up in whatever directory you run the script from (your home folder by default)
#i am not responsible for any damage this may cause
COUNT=1
while [ $COUNT -lt 2346 ]; do
        curl -O "http://www.putsitehere.com/$COUNT.jpg"   #-O saves each file under its remote name (1.jpg, 2.jpg, ...)
        let COUNT=COUNT+1
done

You would save this as whatever.sh and execute it in the terminal. Someone else should look it over first as I don't know if this is totally correct.
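As an aside, curl can expand a numeric range by itself, so the loop above could most likely be collapsed into one command. This is just a sketch using the same placeholder URL, which still has to be changed to the real site:
Code:
#[1-2345] is curl's built-in URL globbing; it requests 1.jpg, 2.jpg, ... up to 2345.jpg in turn
#-f tells curl to skip numbers that return an HTTP error instead of saving the error page
curl -f -O "http://www.putsitehere.com/[1-2345].jpg"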
 
This app should be able to download all the images from a site:
SiteSucker

Or this Firefox plug-in might do it:
DownThemAll
 
Thanks Dew, I can't get those to work with a sequence of files.

Thank you for the script.

That appears to be this: http://linux.byexamples.com/archive...es-from-a-site-using-for-loop-like-c-in-bash/

I just need some guidance on using it.
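For reference, the page linked above builds the same kind of download loop with bash's C-style for syntax. A minimal sketch of that variant, reusing the placeholder URL from the script above (change it to the real site before running):
Code:
#!/bin/bash
#same download loop as before, written with bash's C-style for loop
for (( i = 1; i <= 2345; i++ )); do
        curl -O "http://www.putsitehere.com/$i.jpg"
done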
 
1) Open Terminal

2) Type the command "nano downloadImages.sh" without the quotes.

3) Paste in the script

4) Change the url to the correct one. For example, if the images are at example.com/images/1.jpg, you want the curl line to read example.com/images/$COUNT.jpg

5) Change 2346 to however many images there are plus one (the loop runs while COUNT is less than that number)

6) Press ctrl+o, then enter, to save the file

7) Press ctrl+x to exit nano

8) In Terminal, type "chmod +x downloadImages.sh" without quotes

9) In Terminal, type "./downloadImages.sh" without quotes

You might want to try a test run of just 50 images first to see how long it will take; a quick way to do that is sketched below. All the images will be stored in whatever directory you run the script from (your home folder, if you run it from a fresh Terminal window).
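A minimal sketch of that kind of test run, assuming the script is saved in your home directory and you temporarily changed 2346 to 51 so the loop stops after 50.jpg:
Code:
cd ~                        #run from your home directory so the images land there
time ./downloadImages.sh    #"time" reports how long the 50-image run took when it finishes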
 