Downloading a collection of images from a website, what language?

Discussion in 'Mac Programming' started by hdawg87, Nov 3, 2008.

  1. hdawg87 macrumors newbie

    Joined:
    Nov 3, 2008
    #1
    Hi everybody,

     I'm trying to write an application that will go to the URL of an image and download it into a folder on my hard drive with a specific filename. This will be done over 1000 times with different URLs. My problem is that the URLs will need to be stored in some kind of array and read from there. What language would let me do this most easily? I'm not worried about developing a GUI or anything for this right now. Would AppleScript be able to accomplish this adequately?

     P.S. I'm running a dual 2.0 GHz G5 with 3 GB RAM, on Tiger.

    Thanks in advance
     
  2. lee1210 macrumors 68040

    Joined:
    Jan 10, 2005
    Location:
    Dallas, TX
    #2
    Shell, 100%. I do this every time I find a new web comic and want to grab the whole archive. There's generally a need to generate dates, and out of laziness I tend to do that part in C or something.

    Otherwise it's all curl/wget to grab the page itself, then some awk/grep to parse out the image URL, then another wget to grab the image.
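
    Something like this, roughly (completely untested, and the page URL and the image regex below are made-up placeholders you'd adjust to whatever the site actually serves):

    #!/bin/sh
    # Sketch of the pipeline above: curl the page, grep out anything that
    # looks like an absolute image URL, then fetch each image with curl.
    # PAGE and the regex are placeholders, not a real site.
    PAGE="http://www.example.com/archive.html"

    curl -s "$PAGE" |
      grep -Eo 'http://[^"]+\.(jpg|jpeg|png|gif)' |
      while read IMG; do
        curl -s -O "$IMG"    # -O saves under the remote filename
      done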

    I'm sure it's possible to do this in any number of ways, but I think it would be much more complex.

    -Lee
     
  3. savar macrumors 68000

    Joined:
    Jun 6, 2003
    Location:
    District of Columbia
    #3
    Sounds like a job for Perl.
     
  4. ChrisA macrumors G4

    Joined:
    Jan 5, 2006
    Location:
    Redondo Beach, California
    #4
    Write it in anything that can call wget. Pick the scripting language you know best.
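
    For example, a bare-bones shell loop (an untested sketch; "urls.txt" and its one-URL-plus-filename-per-line format are just assumptions about how you end up storing your list). I used curl since it already ships with OS X; wget works the same way if you have it installed (wget -O file url):

    #!/bin/sh
    # Assumed input format in urls.txt, one entry per line:
    #   http://www.example.com/img/001.jpg  first-image.jpg
    DEST="$HOME/Pictures/downloaded"
    mkdir -p "$DEST"

    while read URL FILE; do
      # -o saves under the filename you chose instead of the remote name
      curl -s -o "$DEST/$FILE" "$URL"
    done < urls.txt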
     
  5. yeroen macrumors 6502a

    Joined:
    Mar 8, 2007
    Location:
    Cambridge, MA
    #5
    CPAN almost surely has some canned web scraping utility for this sort of thing.
     
