How to download a full web page (CSS, JS and images included) and all linked web pages

StackOverflow https://stackoverflow.com/questions/21501321

05-10-2022

Question

This command gets all the files that are necessary to properly display a given HTML page.

wget --page-requisites http://example.com/your/page.html

I want to loop through all the links on that page, i.e. the a href targets, and apply the same command (or a similar one; it doesn't have to be bash) to each of them.
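Spelled out, the loop being asked for would look roughly like the bash sketch below. It assumes the hrefs are absolute http(s) URLs; relative links would first have to be resolved against the page's base URL, which is exactly what wget's recursive mode does for you:

page=http://example.com/your/page.html

# fetch the starting page and its requisites
wget --page-requisites "$page"

# extract absolute href targets and apply the same command to each
wget -qO- "$page" \
  | grep -oE 'href="https?://[^"]+"' \
  | sed -e 's/^href="//' -e 's/"$//' \
  | while read -r url; do
      wget --page-requisites "$url"
    done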


Solution

wget -r -l 2 --page-requisites http://example.com/your/page.html

See man wget

Recursive Retrieval Options

   -r
   --recursive
       Turn on recursive retrieving.  The default maximum depth is 5.

   -l depth
   --level=depth
       Specify recursion maximum depth level depth.
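In the accepted command, -l 2 limits the crawl to the starting page plus the pages it links to directly, and --page-requisites additionally pulls in the CSS, JS and images for every page retrieved. If the goal is a local copy that is browsable offline, two further standard wget options are often added; a sketch, assuming everything of interest lives on the same host under /your/:

# --convert-links: rewrite links in the saved files so they work locally
# --no-parent:     never ascend above the starting directory while recursing
wget -r -l 2 --page-requisites --convert-links --no-parent \
     http://example.com/your/page.html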
License: CC-BY-SA with attribution