質問

The file DRLAncillary_2014-01-25.tgz doesn't exist under the remote directory CompressedArchivedAncillary on the FTP server host is.sci.gsfc.nasa.gov.

If I use the wget command with the --spider option to check its existence, the following output lines are displayed in the terminal window and piped to the output file /tmp/fileinfo.txt:

wget --output-document=/dev/null --spider ftp://is.sci.gsfc.nasa.gov/CompressedArchivedAncillary/DRLAncillary_2014-01-25.tgz 2>&1 | tee /tmp/fileinfo.txt

==================================================================================
--2014-02-17 18:20:25--  ftp://is.sci.gsfc.nasa.gov/CompressedArchivedAncillary/DRLAncillary_2014-01-25.tgz
           => “/dev/null”
Resolving is.sci.gsfc.nasa.gov... 169.154.128.59
Connecting to is.sci.gsfc.nasa.gov|169.154.128.59|:21... connected.
Logging in as anonymous ... Logged in!
==> SYST ... done.    ==> PWD ... done.
==> TYPE I ... done.  ==> CWD (1) /CompressedArchivedAncillary ... done.
==> SIZE DRLAncillary_2014-01-25.tgz ... done.
==> PASV ... done.    --2014-02-17 18:20:29--  ftp://is.sci.gsfc.nasa.gov/CompressedArchivedAncillary/DRLAncillary_2014-01-25.tgz
           => “.listing”
==> CWD (1) /CompressedArchivedAncillary ... done.
==> PASV ... done.    ==> LIST ... done.

     0K .......... .......... .......... ...                   35.6K=0.9s

Removed “.listing”.

No such file “DRLAncillary_2014-01-25.tgz”.

==================================================================================

The file DRLAncillary_2014-01-15.tgz really does exist under the remote directory CompressedArchivedAncillary on the FTP server host is.sci.gsfc.nasa.gov.

If I use the wget command with the --spider option to check its existence, the following output lines are displayed in the terminal window and piped to the output file /tmp/fileinfo.txt:

wget --output-document=/dev/null --spider ftp://is.sci.gsfc.nasa.gov/CompressedArchivedAncillary/DRLAncillary_2014-01-15.tgz 2>&1 | tee /tmp/fileinfo.txt

==================================================================================
--2014-02-17 18:22:18--  ftp://is.sci.gsfc.nasa.gov/CompressedArchivedAncillary/DRLAncillary_2014-01-15.tgz
           => “/dev/null”
Resolving is.sci.gsfc.nasa.gov... 169.154.128.59
Connecting to is.sci.gsfc.nasa.gov|169.154.128.59|:21... connected.
Logging in as anonymous ... Logged in!
==> SYST ... done.    ==> PWD ... done.
==> TYPE I ... done.  ==> CWD (1) /CompressedArchivedAncillary ... done.
==> SIZE DRLAncillary_2014-01-15.tgz ... 1811109782
==> PASV ... done.    --2014-02-17 18:22:21--  ftp://is.sci.gsfc.nasa.gov/CompressedArchivedAncillary/DRLAncillary_2014-01-15.tgz
           => “.listing”
==> CWD (1) /CompressedArchivedAncillary ... done.
==> PASV ... done.    ==> LIST ... done.

     0K .......... .......... .......... ...                   42.0K=0.8s

Removed “.listing”.

File “DRLAncillary_2014-01-15.tgz” exists.
==================================================================================

The wget command with the --spider option fetches the messages I want and can store them in a file. As you can see from the two transcripts above, each delimited at top and bottom by a line of equals signs:

When the searched-for file doesn't exist, the last line of the output reads No such file “DRLAncillary_2014-01-25.tgz”.

When it really exists, the last line of the output reads File “DRLAncillary_2014-01-15.tgz” exists.

==================================================================================

So my questions are:

  1. How can I pipe the wget output through one or more filter commands such as sed, grep, xargs, awk, or tail to extract only the last line from the stream, without first storing the output in a file and then reading the last line from that file? Furthermore, I'd like the extracted string to be held in a variable, with nothing echoed to the screen.

  2. At present, I can use the following composite command to get the last line from the output file /tmp/fileinfo.txt: tail -2 /tmp/fileinfo.txt | head -1

    Although this isn't my ideal solution for a simple file-existence check, if you bash-scripting gurus could show me how to modify my wget command line so that nothing is printed to the terminal, I could make do with it.
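As a sketch of what question 1 asks for, the status line can be captured into a variable without a temporary file. The transcript below is simulated with a here-string-style variable so the example runs offline; in real use the pipeline input would come from `wget ... 2>&1`:

```shell
#!/bin/sh
# Sketch: extract the status line from wget --spider output without a
# temporary file. The sample text mimics the tail of the transcript
# shown above; in real use, replace the printf with:
#   wget --output-document=/dev/null --spider "$url" 2>&1
sample='Removed ".listing".

No such file "DRLAncillary_2014-01-25.tgz".
'
# Command substitution stores the result in a variable with nothing
# echoed to the screen; tail -2 | head -1 skips the trailing blank
# line, mirroring the /tmp/fileinfo.txt approach in the question.
last_line=$(printf '%s\n' "$sample" | tail -2 | head -1)
echo "$last_line"
```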

Thank you in advance!

Was it helpful?

Solution

Sorry, I don't know if I understood the question, but have you tried:

wget ... | awk '/^File/ {print}; /^No such/ {print}'

root@stormtrooper:~# wget --output-document=/dev/null --spider ftp://is.sci.gsfc.nasa.gov/CompressedArchivedAncillary/DRLAncillary_2014-01-25.tgz 2>&1 | awk '/^File/ {print}; /^No such/{print}'
No such file ‘DRLAncillary_2014-01-25.tgz’.
root@stormtrooper:~# wget --output-document=/dev/null --spider ftp://is.sci.gsfc.nasa.gov/CompressedArchivedAncillary/DRLAncillary_2014-01-15.tgz 2>&1  | awk '/^File/ {print}; /^No such/{print}'
File ‘DRLAncillary_2014-01-15.tgz’ exists.

Sorry for the poor English.

Other hints

Can't you just pipe through tail and head without going via a file?

wget --output-document=/dev/null --spider ftp://is.sci.gsfc.nasa.gov/CompressedArchivedAncillary/DRLAncillary_2014-01-15.tgz 2>&1 | tail -2 | head -1

How can I pipe the wget output through one or more filter commands such as sed, grep, xargs, awk, or tail to extract only the last line from the stream, without first storing the output in a file and then reading the last line from that file? Furthermore, I'd like the extracted string to be held in a variable, with nothing echoed to the screen.

 LAST_LINE=$(wget -O- http://URL 2> /dev/null | tail -1)
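One caveat worth flagging (my observation, not from the answer): wget writes its log messages to stderr, so for the --spider case the stream must be merged with 2>&1 before the pipe; redirecting with 2> /dev/null would discard exactly the lines being filtered. A hypothetical wrapper, with the network call stubbed out so the sketch is self-contained (`spider_log` is a stand-in, not a wget feature):

```shell
#!/bin/sh
# Sketch: reusable existence check. spider_log simulates the merged
# stderr/stdout stream of:
#   wget --output-document=/dev/null --spider "$url" 2>&1
spider_log() {
    printf '%s\n' 'Removed ".listing".' '' \
        'File "DRLAncillary_2014-01-15.tgz" exists.' ''
}
# tail -2 | head -1 skips wget's trailing blank line.
LAST_LINE=$(spider_log | tail -2 | head -1)
case $LAST_LINE in
  File*exists.) echo "file exists" ;;
  *)            echo "file missing" ;;
esac
```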
License: CC-BY-SA with attribution