I would like to download all the images on a web page. The tool should be smart enough to examine the CSS and JavaScript files referenced in the page source to look for images.
Ideally, it should also replicate the folder hierarchy, saving the images in the correct folders. For example, the web page may have its menu images stored in images/menu/ and its background images stored in images/bg/.
Is there such a tool that you know of? (Preferably for Windows, but Linux is fine too.)
Many thanks to you all.
Answer
Wget for Windows can recursively download a site, and it should keep your folder structure. You may need to delete the HTML files after it has fetched everything, but IMO it's very easy to use.
http://gnuwin32.sourceforge.net/packages/wget.htm
Use the "-r" flag to recursively download a site, e.g. wget -r http://example.com
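If you mainly want the images rather than the whole site, a more targeted invocation along these lines should work. This is a sketch, not part of the original answer: example.com and the extension list are placeholders, and picking up images referenced from CSS assumes a reasonably recent wget (roughly 1.12 or later, which parses url() references in stylesheets).

wget -r -l 1 -p --no-parent -A jpg,jpeg,png,gif,css -nH http://example.com/

Here -r -l 1 recurses but stays one level deep, -p (--page-requisites) also grabs the images and stylesheets the page needs to render, --no-parent keeps wget from wandering above the starting directory, -A restricts what is kept to the listed extensions (CSS is included so the stylesheets it parses for images are not discarded), and -nH drops the hostname directory so the site's own folder layout (images/menu/, images/bg/, and so on) is reproduced directly under the current directory. As noted above, you may still want to delete any leftover HTML files afterwards.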
Here's a brief tutorial on site downloading.
http://linuxreviews.org/quicktips/wget/