Question

I'm relatively new to Bash scripting, and finally thought of something that would be a good introduction to it. I have a collection of sorting programs I'm trying to time. As with most benchmarking, a large sample size helps, but it's hard to collect one consistently by hand. I figured that automating the process with a Bash script would be a good way to do it, but I don't do much Bash.

The sorting programs are written in C++ and output how long it took for them to sort an array of 10000 integer values read in from a file. I'm using a few different methods to sort the array, including bubble sort, quick sort, and parallelized (Boost threads) quick sort. At the end of their execution a time is output to the console and execution is halted. What I'd like to do in the Bash script is...

for 1 to 100:
    ./quicksortpar --this is the command to start the program
    take time reading from output, place in collection

--when that's done
for 1 to 100 in the collection:
    add each item in the collection to a running total

--when that's done
echo running total / 100

How would I go about accomplishing this in Bash? Is it possible?

EDIT:

Here's the current Zsh script I have from Tony D's guidance:

(screenshot of the script omitted)

Was it helpful?

Solution 2

(Update: accidentally worked this out in zsh - doesn't work in bash)

TOTAL=0
for ((i=1; i<=100; i++))
do
    let TOTAL+=$(./quicksortpar)
done
let AVG=TOTAL/100
echo $AVG

OTHER TIPS

You can use GNU bc for floating-point arithmetic in bash. So do something like the following:

#!/bin/bash
declare -a coll
for _ in  {1..100}; do
  coll+=("$(./quicksortpar)")
done

sum=0
for i in "${coll[@]}"; do
  sum="$(echo "$sum + $i" | bc -l)"
done

echo "$sum / ${#coll[@]}" | bc -l

Note that, per Jonathan Leffler's suggestion, the timings can be summed in a single bc call instead of a loop:

sum=$( { printf "%d+" "${coll[@]}"; echo 0; } | bc -l)
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow