Question

I have to download all log files from a virtual directory within a site. Access to the virtual directory itself is forbidden, but the files inside it are accessible.

I have manually entered the file names to download:

dir="Mar"
for ((i=1;i<100;i++)); do
   wget http://sz.dsyn.com/2014/$dir/log_$i.txt
done

The problem is that the script is not generic: most of the time I need to find out how many files there are and tweak the for loop. Is there a way to get wget to fetch all the files without my having to specify the exact count?

Note: if I view http://sz.dsyn.com/2014/$dir in a browser, it returns 403 Forbidden, so I can't pull all the files via a browser tool/extension.


Solution

First of all, check this similar question. If that is not what you are looking for, you need to generate a file of URLs and feed it to wget, e.g.:

 wget --input-file=http://sz.dsyn.com/2014/$dir/filelist.txt
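If no such filelist.txt is published on the server, one alternative is to generate the URL list locally and pass the local file to wget's --input-file (-i) option. A minimal sketch, assuming the same numbered naming scheme; the upper bound of 1000 is an arbitrary guess, not something the server reports:

# Build a list of candidate URLs locally, then let wget work through it.
dir="Mar"
for ((i=1;i<1000;i++)); do
   echo "http://sz.dsyn.com/2014/$dir/log_$i.txt"
done > filelist.txt

# wget skips URLs that return 404 and continues with the rest of the list.
wget --input-file=filelist.txt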

Other tips

wget will have the same problem your browser has: it cannot read the directory listing. Just pull files until the first failure, then stop.
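A minimal sketch of that approach, assuming the log files are numbered consecutively from 1 with no gaps (wget exits non-zero on a failed download, which ends the loop):

dir="Mar"
i=1
# Keep fetching log_1.txt, log_2.txt, ... until the first download fails.
while wget -q "http://sz.dsyn.com/2014/$dir/log_$i.txt"; do
   i=$((i+1))
done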

License: CC-BY-SA with attribution
Not affiliated with StackOverflow