Question

I am trying to download a range of pages from a site.

The URL would be in the format http://example.com/x, where x could be any number from 100 to 200.

Is there a script I could use to download all of the pages, from example.com/100 to example.com/200?


Solution

for x in {100..200}; do
    wget "http://example.com/$x"
done
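
If you would rather run a single wget process, brace expansion can also generate all the URLs up front, so one invocation downloads every page (a sketch assuming a shell with brace expansion, such as bash or zsh):

wget 'http://example.com/'{100..200}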

Other tips

If you have curl installed, you might want to use the following, relatively efficient approach:

curl -O 'http://example.com/[100-200]'

(yes, I know, curl is really cool!).
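
In case you want the saved files named something other than the bare numbers, here is a small sketch: curl exposes the value matched by the [100-200] range as #1 in the -o output template (the page_ prefix is just an example name):

curl 'http://example.com/[100-200]' -o 'page_#1.html'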

A faster version that runs the downloads in parallel and does not risk breaking more complex URLs with incorrect quoting:

for x in {100..200}; do
    wget 'http://example.com/'"$x" &
done
wait
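
Starting a hundred background jobs at once may be harder on the server (and your connection) than you want. As a middle ground, assuming your xargs supports -P (GNU and BSD versions do), you can cap the number of concurrent downloads; the limit of 8 here is just an example:

seq 100 200 | xargs -I{} -P 8 wget 'http://example.com/{}'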
License: CC-BY-SA with attribution