Help with Web Grabber?


Machina

Does anyone know of a program that can download every available page under a given domain and tuck it all up in a neat little folder? When saving an individual page, I get all the pictures and code needed to replicate the page when offline, but I'm looking for a way to do that writ large, like downloading every page in Wikipedia or something like that.

The plain explanation is that I'm thinking about archiving. This site, specifically. You know. Just in case.

Also, some day I mean to pore over it all and actually read everything.


A) If you look, there are already archives of Wikipedia for download (they're very handy, really).

B) If you don't actually want Wikipedia, Google for: Website mirroring freeware

There were several promising links a couple down from the top, but they're blocked from work. (Apparently I can no longer access free-software sites. Odd.)
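For what it's worth, GNU Wget can do this kind of recursive mirroring from the command line, and it's preinstalled or a one-line install on most systems. A minimal sketch (the URL is a placeholder; point it at the site you actually want to archive):

```shell
# Mirror a site into a local folder, rewriting links so it browses offline.
# --mirror           : recurse the whole site and keep timestamps for re-runs
# --convert-links    : rewrite links in saved pages to point at the local copies
# --page-requisites  : also grab the images, CSS, and scripts each page needs
# --adjust-extension : save pages with .html extensions so they open locally
# --wait=1           : pause between requests to go easy on the server
wget --mirror --convert-links --page-requisites --adjust-extension \
     --wait=1 https://example.com/
```

If you'd rather have a point-and-click tool, HTTrack is a well-known free website copier that does much the same thing.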
