
johnsmithson

macrumors newbie
Original poster
Jan 5, 2011
I have created a simple workflow in Automator which will extract text from a list of URLs and save it all into one text file.

The actions are:

Get Specified Text
Filter Paragraphs (return paragraphs that begin with http://)
Get Text from Webpage
New Text File

Whenever there is a problem with one of the URLs, Automator throws an error and the whole workflow stops. Is it possible to make it skip the problem URL and continue with the rest?
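For context, this is roughly the behaviour I'm after. The sketch below is not the Automator workflow itself; it assumes the fetching were instead done by a script (for example in a Run Shell Script action set to pass input to stdin), with the URL list arriving one per line and a made-up output path. It just shows the "skip and keep going" idea; it also grabs the raw page rather than the stripped text that Get Text from Webpage produces.

#!/usr/bin/env python3
# Minimal sketch: fetch each URL, append what comes back to one file,
# and skip any URL that fails instead of stopping the whole run.
import sys
import urllib.request

OUTPUT_PATH = "/tmp/scraped_text.txt"  # hypothetical output location

with open(OUTPUT_PATH, "w") as out:
    for line in sys.stdin:
        url = line.strip()
        if not url.startswith("http://"):
            continue  # mirrors the Filter Paragraphs step
        try:
            # A timeout keeps one dead host from hanging the whole run.
            with urllib.request.urlopen(url, timeout=15) as resp:
                out.write(resp.read().decode("utf-8", errors="replace"))
                out.write("\n")
        except Exception as err:
            # Skip the problem URL and carry on with the next one.
            sys.stderr.write("Skipping %s: %s\n" % (url, err))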

Attachments

  • URL Scraper.zip
    55.2 KB