In my bash script I run grep over several different logs like this:

LOGS1=$(grep -E -i 'err|warn' /opt/backup/exports.log /opt/backup/imports.log && grep "tar:" /opt/backup/h2_backups.log /opt/backup/st_backups.log)

if [ -n "$LOGS1" ]; then
        COLOUR="yellow"
        MESSAGE="Logs contain warnings. Backups may be incomplete. Investigate these warnings:\n$LOGS1"
fi

Instead of checking whether each log exists (there are many more logs than this), I want to check stderr while the script runs to see if I get any output. If one of the logs does not exist, it produces an error like this: grep: /opt/backup/st_backups.log: No such file or directory

I've tried to read stderr with commands like command 2> >(grep "file" >&2) but that does not seem to work.
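A fuller sketch of that attempt, with the missing closing parenthesis added (the log path is one of the ones from the script above). Note that command substitution only captures stdout, so the lines filtered by the inner grep still go to the terminal's stderr rather than into the variable, which may be why it appears not to work:

# stdout of the outer grep is captured; its stderr is fed to the inner grep,
# which writes the filtered lines back to stderr (the terminal), not to LOGS1.
LOGS1=$(grep "tar:" /opt/backup/st_backups.log 2> >(grep "file" >&2))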

I know I can pipe the output to a file, but I'd rather just handle the stderr when there is any output instead of reading the file. Or is there any reason why piping to a file is better?


Solution

Redirect standard error (file descriptor 2) to standard output (file descriptor 1) and assign the result to the variable Q:

$ Q=$(grep text file 2>&1)
$ echo $Q
grep: file: No such file or directory
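Applied to the script in the question, a minimal sketch might look like the following. The two greps are run on separate lines instead of being joined with && so that the second one runs even when the first finds nothing; keep && if that behaviour was intended:

# Capture both the matches and any grep errors (e.g. a missing log file).
LOGS1=$(
    grep -E -i 'err|warn' /opt/backup/exports.log /opt/backup/imports.log 2>&1
    grep "tar:" /opt/backup/h2_backups.log /opt/backup/st_backups.log 2>&1
)

if [ -n "$LOGS1" ]; then
        COLOUR="yellow"
        MESSAGE="Logs contain warnings. Backups may be incomplete. Investigate these warnings:\n$LOGS1"
fi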

Other tips

This is the default behaviour: stderr is normally connected to your terminal (and unbuffered), so you still see errors even while stdout is piped somewhere else. If you want to merge stderr into stdout, the syntax is:

command >file 2>&1
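As a quick usage note (missing.log and out.log are just placeholder names), the order of the redirections matters:

# Both streams end up in out.log: stdout is redirected first, then stderr
# is pointed at the same place.
grep text missing.log > out.log 2>&1

# Here stderr still goes to the terminal: 2>&1 duplicates stderr onto
# wherever stdout pointed at that moment, before stdout is redirected.
grep text missing.log 2>&1 > out.log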