H@L0_F00 Posted April 23, 2010

There have been a few times I have wanted to do this. I have searched and searched, but what I've found doesn't do what I want or need. Maybe somebody here will know what program or script I could use to accomplish this.

Suppose I have a website (http://www.nirsoft.net/utils/) with a bunch of things I would like to download. Suppose the page itself only has links to subpages (http://www.nirsoft.net/utils/blue_screen_view.html, etc.), and it's those subpages that contain the links to the downloads I want. To make it even simpler, I essentially want to say: "Follow HTML links up to 2 pages deep and download (or even just list) just the ZIP files." Does anybody know of a program or script that can do such a thing? Thanks.
digip Posted April 24, 2010

WGET is what you want. It can spider pages, download only the file types you specify, or clone all followable links, images, EXEs, text files, etc. It will require some command-line fu, but its documentation is pretty easy to understand. I've posted my script a few times for downloading from RSS feeds using a Windows batch script, some VBS, and wget; if you have a look at the code you can rewrite it to do what you want. Search the forums, I don't have the link handy...
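For what the first post describes, a minimal sketch of the wget invocation could look like this (the URL is the one from the first post; the depth and file-type flags are standard GNU Wget options, but adjust them to taste):

```shell
# Recursively follow links up to 2 levels deep (-l 2), never climb above
# /utils/ (-np, "no parent"), and keep only files ending in .zip (-A zip):
wget -r -l 2 -np -A zip http://www.nirsoft.net/utils/

# To just LIST the ZIP links without downloading them, spider the site
# instead and pull the URLs out of wget's log output:
wget -r -l 2 -np --spider http://www.nirsoft.net/utils/ 2>&1 \
  | grep -oE 'https?://[^ ]+\.zip' | sort -u
```

The first command saves everything under a directory named after the host; add `-nd` if you'd rather have all the ZIPs dumped into the current directory.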
Infiltrator Posted April 25, 2010

I think you will find this link very useful: http://www.iwebtool.com/link_extractor