"Download all linked files from a web page"

linux command line

Here is a simple way to download all files of a given type (zip archives, in this example) linked from a web page:

wget -r -np -l 6 -nd -A zip 'https://example.com/downloads'
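Briefly, the flags: -r turns on recursive retrieval, -np (no-parent) stops wget from climbing above the starting directory, -l 6 limits the recursion to six levels of links, -nd (no-directories) saves everything into the current directory instead of recreating the server's directory tree, and -A zip keeps only files ending in zip (pages fetched just to discover links are deleted after parsing).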

It should go without saying, but be careful how you use this. The information may be public, but your access can still be blocked, and scraping websites without express permission sits in dubious legal and ethical territory.
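If you do have a legitimate reason to crawl, wget can also throttle itself so you are not hammering the server. A rough sketch of a gentler variation (the URL, delay, and rate cap are placeholders, not recommendations) might look like:

wget -r -np -l 6 -nd -A zip --wait=2 --random-wait --limit-rate=200k 'https://example.com/downloads'

--wait pauses between requests, --random-wait varies that pause, and --limit-rate caps download bandwidth.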

Published on 2021-05-08