Question

Using any tools you would expect to find on a *nix system (or MS-DOS, if you like), what is the easiest/fastest way to calculate the mean of a set of numbers, assuming you have them one per line in a stream or file?


Solution

Awk

awk '{ total += $1; count++ } END { print total/count }'
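For example, assuming the numbers live in a file called numbers.txt (a placeholder name), you can point awk at the file directly or feed it from a pipe; something like this should do, with seq just standing in for your real input:

awk '{ total += $1; count++ } END { print total/count }' numbers.txt
seq 1 10 | awk '{ total += $1; count++ } END { print total/count }'    # prints 5.5

Note that on empty input both variables stay at zero, so awk will typically abort with a division-by-zero error.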

OTHER TIPS

awk '{ n += $1 } END { print n / NR }'

This accumulates the sum in n, then divides by the number of items (NR is awk's built-in record count, i.e. the number of lines read).

Works for integers and reals alike.
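As a quick sanity check (a sketch assuming a typical awk, with printf standing in for your real data source):

printf '1\n2.5\n4\n' | awk '{ n += $1 } END { print n / NR }'
# prints 2.5, i.e. (1 + 2.5 + 4) / 3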

Using Num-Utils for UNIX:

average 1 2 3 4 5 6 7 8 9

A Perl one-liner that reads one number per line works too:

perl -e 'while (<>) { $sum += $_; $count++ } print $sum / $count, "\n"'
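Since Perl's <> operator reads from standard input or from any filenames passed as arguments, the same one-liner also works on a file directly (numbers.txt is again just a placeholder):

perl -e 'while (<>) { $sum += $_; $count++ } print $sum / $count, "\n"' numbers.txt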

In PowerShell, it would be:

get-content .\meanNumbers.txt | measure-object -average

Of course, that's the verbose syntax. If you typed it using aliases,

gc .\meanNumbers.txt | measure-object -a

Using "st" (https://github.com/nferraz/st):

$ st numbers.txt
N      min   max    sum    mean  sd
10.00  1.00  10.00  55.00  5.50  3.03

Specify an option to see individual stats:

$ st numbers.txt --mean
5.5

(DISCLAIMER: I wrote this tool :))

Perl.

my @a = <STDIN>;

# sum the lines, then divide by the element count
my $sum = 0;
for (my $i = 0; $i < scalar @a; $i++)
{
   $sum += $a[$i];
}

print $sum / scalar @a, "\n";

Caveat Emptor: My syntax may be a little whiffly.
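Saved as, say, mean.pl (a hypothetical filename), the script above would be run with the numbers on standard input:

perl mean.pl < numbers.txt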

Ruby one-liner

cat numbers.txt | ruby -ne 'BEGIN{$sum=0}; $sum=$sum+$_.to_f; END{puts $sum/$.}'
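The cat isn't strictly needed: with -n Ruby reads from any filenames given on the command line (or from stdin when there are none), and $. is its built-in input line counter, so something like this should be equivalent (numbers.txt again being a placeholder):

ruby -ne 'BEGIN{$sum=0}; $sum += $_.to_f; END{puts $sum/$.}' numbers.txt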

