Machina · Posted September 18, 2006

Does anyone know of a program that can download every available page under a given domain and tuck it all up in a neat little folder? When saving an individual page, I get all the pictures and code needed to replicate the page when offline, but I'm looking for a way to do that writ large, like downloading every page in Wikipedia or something like that.

The plain explanation is that I'm thinking about archiving. This site, specifically. You know. Just in case.

Also, some day I mean to pore over it all and actually read everything.
kestrel404 · Posted September 18, 2006

A) If you look, there are already archives of Wikipedia for download (they're very handy, really). B) If you don't actually want Wikipedia, Google for: website mirroring freeware. There were several promising links a couple down from the top, but they're blocked from work. (Apparently, I can no longer access free-software sites. Odd.)
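For anyone reading this later: below is a rough sketch of what such a mirroring tool does under the hood, written in Python using only the standard library. The start URL, output folder, and page cap are placeholders, the path mapping is naive (it does not handle collisions like /foo vs. /foo/), and real mirroring freeware also rewrites links for offline browsing, which this sketch skips.

# Minimal same-domain mirroring sketch; not a substitute for dedicated tools.
import os
import urllib.parse
import urllib.request
from html.parser import HTMLParser

START_URL = "http://example.com/"   # placeholder: the site to archive
OUT_DIR = "mirror"                  # placeholder: local archive folder

class LinkParser(HTMLParser):
    """Collects href/src attribute values from a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)

def local_path(url):
    """Map a URL to a file path under OUT_DIR (query strings ignored)."""
    parsed = urllib.parse.urlparse(url)
    path = parsed.path.lstrip("/") or "index.html"
    if path.endswith("/"):
        path += "index.html"
    return os.path.join(OUT_DIR, parsed.netloc, path)

def mirror(start_url, max_pages=100):
    """Breadth-first crawl that saves every page on the starting domain."""
    domain = urllib.parse.urlparse(start_url).netloc
    queue, seen = [start_url], set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            with urllib.request.urlopen(url) as resp:
                body = resp.read()
                ctype = resp.headers.get("Content-Type", "")
        except Exception as err:
            print("skipped", url, err)
            continue
        # Save the raw bytes to a mirrored directory tree.
        dest = local_path(url)
        os.makedirs(os.path.dirname(dest), exist_ok=True)
        with open(dest, "wb") as f:
            f.write(body)
        # Only parse HTML pages for further links; images etc. are leaves.
        if "html" in ctype:
            parser = LinkParser()
            parser.feed(body.decode("utf-8", errors="replace"))
            for link in parser.links:
                absolute = urllib.parse.urljoin(url, link).split("#")[0]
                # Stay within the original domain.
                if urllib.parse.urlparse(absolute).netloc == domain:
                    queue.append(absolute)

if __name__ == "__main__":
    mirror(START_URL)

The breadth-first queue plus the domain check is what keeps the crawl from wandering off onto the rest of the web; the page cap is a safety valve against sites that generate unbounded URLs.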
Machina (Author) · Posted September 19, 2006

Actually, the site I was referring to was N Prime. Thanks for the second option, though, Alchemist; I'll give it a go.
This topic is now archived and is closed to further replies.