Download all linked files from a web page

Tags: linux command-line

Here is a simple method to download all files of a particular type (zip archives, in this case) linked from a web page:

wget -r -np -l 6 -nd -A zip ''

The flags: -r follows links recursively, -np ("no parent") keeps wget from climbing above the starting directory, -l 6 limits recursion to six levels deep, -nd saves everything into the current directory instead of recreating the site's directory tree, and -A zip accepts only files ending in .zip (change it to grab other file types). Put the page's URL between the quotes.

It should go without saying, but be careful how this is used. Information may be public, but your access might get blocked, and scraping websites without express permission is in dubious legal and ethical territory.
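If you do have permission, a gentler variant is to pace the requests so you don't hammer the server. This is a sketch using standard wget throttling options; the URL here is a placeholder, not a real target:

```shell
# Same recursive zip download, but politer:
#   --wait=2       pause 2 seconds between retrievals
#   --random-wait  vary that pause randomly to avoid a fixed request rhythm
#   --limit-rate   cap download bandwidth
wget -r -np -l 6 -nd -A zip \
     --wait=2 --random-wait \
     --limit-rate=200k \
     'https://example.com/downloads/'
```

Throttling also makes it less likely your access gets blocked in the first place.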

Published on 08 May 2021