
dropadrop

macrumors member
Original poster
Sep 2, 2006
I'm trying to use SiteSucker to make backups of a few sites. However, I'm not managing to get exclusions to work, with or without regex.

For example, when downloading a forum I end up with bucketloads of search pages. I've tried excluding url/forum/search.php, url/forum/search*, url/forum/search.*, forum/search+, and forum/search.+, but nothing works. I still end up with loads of search-xyz pages.
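One way to narrow this down is to check whether the pattern itself actually matches the URLs being downloaded, separately from SiteSucker. Below is a minimal sketch in Python, assuming (as an untested guess) that SiteSucker's exclusion field takes a standard regular expression matched anywhere in the full URL; the domain, sample URLs, and the candidate pattern are all hypothetical:

import re

# Hypothetical examples of the search URLs that keep getting downloaded.
urls = [
    "http://example.com/forum/search.php?searchid=123",
    "http://example.com/forum/search-42.html",
    "http://example.com/forum/viewtopic.php?t=99",
]

# Candidate exclusion pattern: any URL whose path contains /forum/search
pattern = re.compile(r"/forum/search")

for url in urls:
    excluded = bool(pattern.search(url))
    print(f"{url} -> {'excluded' if excluded else 'kept'}")

If a plain pattern like this matches the offending URLs here but SiteSucker still downloads them, the problem is more likely how the exclusion is applied (e.g. path-only vs. full URL matching) than the regex itself.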

Anyone have a clue what could be going wrong?
 