Question

I have the following bash script, which takes tabular data as input, grabs the first line and prints its fields vertically, one per line:

#!/bin/bash
# myscript.sh
export LC_ALL=C
file=$1
head -n1 $file |
tr "\t" "\n" |
awk '{print $1 " " NR-1}'

The problem is that I can only execute it this way:

$ myscript.sh some_tab_file.txt 

What I want, on top of the above capability, is to also allow this:

$ cat some_tab_file.txt | myscript.sh

Namely, take the input from a pipe. How can I achieve that?


Solution

I'd normally write:

export LC_ALL=C
head -n1 "$@" |
tr "\t" "\n" |
awk '{print $1 " " NR-1}'

This works with any number of arguments, including none at all. Using "$@" is important in this and many other contexts; see the Bash manual's sections on special parameters and shell parameter expansion for more on the many and varied notations available for controlling how shell parameters are handled. Generally, the double quotes are a good idea, especially if the file names may contain spaces.
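As a quick check (a sketch; the script name myscript.sh is taken from the question), assuming those three lines are saved as myscript.sh and made executable, both invocation styles should now produce the same output:

$ ./myscript.sh some_tab_file.txt
$ cat some_tab_file.txt | ./myscript.sh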

OTHER TIPS

A common idiom is to fall back to standard input if there are no arguments. There is a convenient shorthand for that:

file=${1--}

The substitution ${variable-fallback} evaluates to the variable's value, or fallback if it's unset.
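A quick illustration of that behaviour (the variable name is hypothetical, shown only to demonstrate the expansion):

unset name
echo "${name-fallback}"   # name is unset, so this prints "fallback"
name=""
echo "${name-fallback}"   # name is set (to the empty string), so this prints an empty line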

I believe your script should work as-is, though; head will read standard input if the (unquoted!) file name you pass in evaluates to the empty string.

Take care to properly double-quote all interpolations of "$file", by the way; otherwise, your script won't work on file names containing spaces or shell metacharacters. (Doing so does break the fortunate side effect of not passing a file name to head when the script received no argument, though.)
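Putting the two tips together, here is a sketch of the original script that keeps the quoting and still reads standard input when no file name is given. It relies on head treating - as standard input, which GNU and BSD head both do:

#!/bin/bash
# myscript.sh
export LC_ALL=C
file=${1--}          # first argument, or - (standard input) if there is none
head -n1 "$file" |   # quoted, so file names with spaces work
tr "\t" "\n" |
awk '{print $1 " " NR-1}'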

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow