Question

I have multiple piped commands, like this:

find [options] | grep [options] | xargs grep [options]

Each one of them can potentially produce errors (permissions errors, spaces-in-filenames errors, etc) that I am not interested in. So, I want to redirect all errors to /dev/null. I know I can do this with 2>/dev/null, for each command. Can I set IO redirection persistently? Ideally, I would just set it once, at the beginning/end of the command, and then it would affect all subsequent/preceding piped commands. Also, can IO redirection be set permanently, so that it continues to affect all commands until it is reset?

I'm using bash (I checked the man page for bash builtins and did not see the '>' and '<' characters at the top, so I assumed it was a Linux thing... sorry)


Solution

I'm going to assume that you're using bash, or at least some sort of Bourne-like shell.

I'm also assuming that what you want to avoid is the following:

find ... 2>/dev/null | grep ... 2>/dev/null | xargs ... 2>/dev/null

i.e. repeating the 2>/dev/null part for each segment of the pipeline.

You can do that with:

( find ... | grep ... | xargs ... ) 2>/dev/null

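To make this concrete, here is a minimal sketch (the failing `ls` stands in for any command in your pipeline that writes to stderr; the paths are illustrative):

```shell
# One redirection on the subshell silences stderr for every command
# inside the parentheses, while stdout still flows through the pipe.
( ls /no/such/dir; echo found ) 2>/dev/null | grep found
```

The `ls` error is discarded, but `grep` still receives and prints `found` on stdout.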
You can also set the redirection permanently as follows:

exec 2>/dev/null

and, assuming that stdout and stderr were both pointing to the same place beforehand, you can undo that with:

exec 2>&1

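Note that the `exec 2>&1` restore only works if stdout still points where stderr originally did. A more robust sketch saves the original stderr on a spare file descriptor first (fd 3 is an arbitrary choice here):

```shell
exec 3>&2          # fd 3 now duplicates the original stderr
exec 2>/dev/null   # discard stderr for all subsequent commands
ls /no/such/dir    # this error message is silently dropped
exec 2>&3 3>&-     # restore stderr from fd 3, then close fd 3
echo back >&2      # this message reaches the original stderr again
```

This restores stderr correctly regardless of what has happened to stdout in the meantime.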
OTHER TIPS

Start a new shell with a redirected stderr, and run your pipeline in there?

Something like

$ bash 2>/dev/null -c "find [options] | grep [options] | xargs grep [options]"

It seems to work (i.e. no visible output) when I do

$ bash 2>/dev/null -c "echo nope 1>&2"

Of course, a side effect is that it will be tricky to work out what happened if anything unexpected goes on, but that's your business.
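If that worries you, one variant (the log path here is just an illustration) is to redirect the child shell's stderr to a file instead of /dev/null, so the errors stay out of your way but can still be inspected later:

```shell
# Errors from the whole pipeline go to a log file rather than the terminal.
bash 2>/tmp/pipeline-errors.log -c "ls /no/such/dir; echo done"
cat /tmp/pipeline-errors.log   # the ls error message ends up here
```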

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow