Web site download options

I have a work website with a section that links to the latest evidence. It's quite an effort to keep going through all the links, so I was hoping to find some automated way of grabbing those sections, converting them to pdfs and saving them for offline review on my iPad.

Does anyone know of any software that can download sections from a website?

Main page
Section of information
Within that there are subsections of interest
Within those is the data I want

Software:
Hazel
Other options greatly appreciated.

You can convert website pages to pdfs several levels deep with Adobe Acrobat. If you only need it as a one-off, Adobe will let you subscribe for a single month for $23.

On my Mac I use SiteSucker, a great $4.99 utility in the Mac App Store, to download sites and parts of sites. It works by polling all branched URLs and duplicating the structure of the site (or the part of the site you're sucking), including all images, pdfs and stylesheets. Essentially it's a wrapper on top of a UNIX command - either cURL or wget, I forget which - so the same thing can be done for free. It will grab an entire site for offline browsing without you needing to do anything else, but it will not create pdfs of each URL, which is what you're looking for. Still, a very useful app.
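If you'd rather do the free command-line version of the same thing, a wget invocation along these lines should mirror one section of a site - the URL and the depth are placeholders you'd adjust to match your main page > section > subsection layout:

```
# Mirror one section of a site for offline browsing.
# --recursive --level=3   follow links up to 3 levels deep
# --no-parent             stay inside the starting directory, don't climb up the site
# --page-requisites       also fetch the images, stylesheets etc. each page needs
# --convert-links         rewrite links so the local copy browses offline
# --adjust-extension      save pages with .html extensions
wget --recursive --level=3 --no-parent \
     --page-requisites --convert-links --adjust-extension \
     https://example.com/evidence/
```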

For free, perhaps with Automator and Keyboard Maestro you could use a wget command like the one above to auto-save pages as pdfs once downloaded (and then delete the site pages afterwards), but that's beyond my abilities.
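For anyone who wants to attempt that, here's a rough sketch of the pipeline. It assumes wkhtmltopdf for the HTML-to-pdf step - a free tool that isn't mentioned above and is just one way to do the conversion - plus placeholder URLs:

```
#!/bin/sh
# 1. Download the section (placeholder URL, same wget flags as above).
wget --recursive --level=3 --no-parent --convert-links --adjust-extension \
     --directory-prefix=site https://example.com/evidence/

# 2. Convert every downloaded .html page to a pdf alongside it.
find site -name '*.html' | while read -r page; do
    wkhtmltopdf "$page" "${page%.html}.pdf"
done

# 3. Delete the html pages, keeping only the pdfs for offline review.
find site -name '*.html' -delete
```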

You could also use HTTrack (https://www.httrack.com/page/2/), which works like wget but with a graphical interface (a GUI).
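HTTrack can be driven from the command line too, if you prefer. Something along these lines - the URL and the filter pattern are placeholders - should mirror just the one section into a local folder:

```
# Mirror only the evidence section into ./mirror, skipping the rest of the site.
httrack "https://example.com/evidence/" -O ./mirror "+*.example.com/evidence/*" -v
```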