Download All Linked Files From a Web Page
Here is a simple method, using wget, to download all files of a given type linked from a web page:
wget -r -np -l 6 -nd -A zip 'https://example.com/downloads'
-r – Recurse into subdirectories (follow links)
-np – No parent (do not load any pages from a parent level)
-nd – No directories (all fetched files will be in the same folder)
-l 6 – Do not go deeper than 6 levels of link recursion
-A zip – Specify accepted extensions (in this case, only .zip files)
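The -A flag takes a comma-separated list of suffixes, so one command can collect several file types at once. For example (the extension list here is just an illustration):
wget -r -np -l 6 -nd -A zip,pdf,tar.gz 'https://example.com/downloads'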
It should go without saying, but be wary of how you use this. The information may be public, but your access might still get blocked, and scraping websites without express permission is in dubious legal and ethical territory.
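One way to reduce both the chance of being blocked and the load you place on the server is to throttle the crawl with wget's built-in pacing options. A sketch of the same command with polite defaults; the wait time and rate limit shown are arbitrary example values, not recommendations:
wget -r -np -l 6 -nd -A zip --wait=2 --random-wait --limit-rate=200k 'https://example.com/downloads'
Here --wait pauses between requests, --random-wait varies that pause so the traffic looks less mechanical, and --limit-rate caps download bandwidth.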