
max2

macrumors 603
Original poster
May 31, 2015
6,446
2,051
I want to test whether my website downloader works.
 
I was going to suggest example.com:
Code:
curl -s http://www.example.com | wc
      50     120    1270

curl -s http://www.google.com | wc
       5     313   12292

Then I realized the example.com page links to iana.org, so if your "website downloader" follows HTML links, that could get ugly.

One simple solution is to make your own test "website": put a couple of HTML pages in an Amazon S3 bucket, with as many or as few links in them as you want, then point the downloader at the root URL of your bucket. Make the objects public, i.e. no permission required (see the Mac App Store for S3 utility apps). You can also enable S3 access logging and reporting, so you can see all the GET requests, or counts of them.

Given S3's low cost, this might run you a few pennies. I'm not sure Amazon even bills amounts that low, so it could well cost $0.00.
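For example, a two-page test site is just two linked files. A minimal sketch (the filenames are arbitrary, and the bucket name in the comment is a placeholder; uploading assumes you have the AWS CLI installed and configured and that your bucket permits public reads):

```shell
# Build a tiny two-page test "site" locally; each page links to the other.
mkdir -p testsite
cat > testsite/index.html <<'EOF'
<html><body><p>Test page 1.</p><a href="page2.html">page 2</a></body></html>
EOF
cat > testsite/page2.html <<'EOF'
<html><body><p>Test page 2.</p><a href="index.html">back to index</a></body></html>
EOF

# Then upload it to your bucket (bucket name is a placeholder):
# aws s3 cp testsite/ s3://my-downloader-test/ --recursive
```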
 
Or make your own site on your local computer, start a server, and then point your downloader to it.

If you haven't built a Ruby on Rails app yet, you could do it in less time than it took for the above replies to get posted to this thread.
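You don't even need Rails for this; Python's built-in http.server will do. A sketch (the port number and test files are arbitrary; curl stands in for the downloader):

```shell
# Make a one-page test site with a link to a second page.
mkdir -p site
echo '<html><body><a href="b.html">b</a></body></html>' > site/a.html
echo '<html><body>hello</body></html>' > site/b.html

# Serve it on localhost in the background (Python 3.7+ supports --directory).
python3 -m http.server 8000 --directory site &
SERVER_PID=$!
sleep 1

# Point your downloader at http://localhost:8000/ ; curl stands in here.
curl -s http://localhost:8000/a.html | wc

kill $SERVER_PID
```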
 
Or make your own site on your local computer, start a server, and then point your downloader to it.

If you haven't built a Ruby on Rails app yet, you could do it in less time than it took for the above replies to get posted to this thread.

I think you're being a _touch_ optimistic :D
 