Section: (pj)
Updated: 2021-10-19

httrack is a great program for grabbing a local copy of a whole website.

To grab a whole domain, including photos and other files, and non-html files from other domains, do this (using example.com as a stand-in for the site you want):

httrack https://example.com -n '*'

The -n flag gets non-html files. The filter at the end says to get all files located at the domain.
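A fuller invocation adds an output directory with httrack's -O flag; the URL and path below are made-up placeholders, not part of the original notes:

```shell
# Mirror example.com into ~/mirrors/example.
# -O sets the output path for the mirror, logfiles, and cache;
# -n also grabs non-html files "near" each downloaded page.
# Substitute your own URL and directory.
httrack https://example.com \
    -O ~/mirrors/example \
    -n '*'
```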

You can use -rN to limit the depth to N links. To get a single page and its media, do something like this (again with example.com standing in for your site):

httrack https://example.com -n '*' -r1

To get all the pages linked off that page (as if it were a table of contents), just use -r2:

httrack https://example.com -n '*' -r2
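As a sketch of how the two depths compare (the URL and output paths are placeholders I've invented for illustration):

```shell
# -r1: just the named page itself, plus its nearby media via -n.
httrack https://example.com/toc.html -n '*' -r1 -O ~/mirrors/toc-only

# -r2: the named page plus every page it links to directly.
httrack https://example.com/toc.html -n '*' -r2 -O ~/mirrors/toc-and-links
```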

If you're copying a blogspot blog, you probably need some more filters.
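One hedged guess at such filters: blogspot usually serves images from *.bp.blogspot.com, a different domain than the blog itself, so widening the mirror with + patterns along these lines may be what's needed (the blog URL and exact patterns here are assumptions, not tested syntax from the original notes):

```shell
# Placeholder blog URL; the '+...' arguments are httrack filters
# that widen the mirror to include blogspot's image hosts.
httrack https://someblog.blogspot.com \
    -n '+*.blogspot.com/*' '+*.bp.blogspot.com/*' \
    -O ~/mirrors/someblog
```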


Paul A. Jungwirth.
