I use it on Linux.
You dump the website to your hard disk (using wget with its mirroring switch), then use sed, awk, or a homegrown script to produce a nice clean text file along with a detailed directory map of the site. If that's what you're after. But at that point you've essentially cloned the website to your machine, and you can extract whatever data you want from the local copy...
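A rough sketch of that workflow (the site URL `example.com` here is just a placeholder, and the sed one-liner is a crude tag-stripper, not a real HTML parser):

```shell
# Clone the site locally. --mirror turns on recursion and timestamping;
# --convert-links rewrites internal links to work offline.
wget --mirror --convert-links --adjust-extension --no-parent https://example.com/

# Build a directory map of the clone:
find example.com -type f > sitemap.txt

# Strip HTML tags from a page to get plain text (crude but often good enough):
sed -e 's/<[^>]*>//g' example.com/index.html > index.txt
```

For anything beyond quick extraction, a proper HTML-aware tool will handle nested or multi-line tags that the sed pattern misses.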