Question

In C, if I want to check whether an int is divisible by 2, x % 2 should run as fast as (x % 10) % 2, because a good compiler will just look at the last bit. But what about in a language with arbitrary-precision arithmetic?

In particular, in Haskell, which would be faster (or would they be the same speed): even x or even (rem x 10)?
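
For what it's worth, the two expressions always agree: x = 10*q + r with r = rem x 10, and 10*q is even, so x and its last decimal digit have the same parity. A minimal sanity check, assuming the QuickCheck package is available (prop_sameParity is just an illustrative name):

import Test.QuickCheck (quickCheck)

-- x = 10*q + r, and 10*q is always even, so x and its
-- last decimal digit have the same parity
prop_sameParity :: Integer -> Bool
prop_sameParity x = even x == even (x `rem` 10)

main :: IO ()
main = quickCheck prop_sameParity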


Solution

Okay, I'll bite:

import Control.DeepSeq
import Criterion.Main
import Data.Bits
import System.Random

-- 10,000 random Integers between 2^128 and 2^132, big enough to land in
-- Integer's large-number representation rather than any small-value fast path
lotsOfBigNumbers :: [Integer]
lotsOfBigNumbers = take 10000 $ randomRs (2^128, 2^132) (mkStdGen 42)

evenRem, evenBits, remRem :: Integer -> Bool
evenRem  x = even (x `rem` 10)            -- reduce mod 10 first, then test parity
evenBits x = (x .&. 1) == 0               -- inspect only the low bit
remRem   x = ((x `rem` 10) `rem` 2) == 0  -- the literal (x % 10) % 2 from the question

main :: IO ()
main = do
    -- force the list up front so generating the numbers isn't benchmarked
    deepseq lotsOfBigNumbers (return ())
    defaultMain
        [ bench "even"     $ nf (map even    ) lotsOfBigNumbers
        , bench "evenRem"  $ nf (map evenRem ) lotsOfBigNumbers
        , bench "evenBits" $ nf (map evenBits) lotsOfBigNumbers
        , bench "remRem"   $ nf (map remRem  ) lotsOfBigNumbers
        ]

And the results:

sorghum:~/programming% ghc -O2 test && ./test
[1 of 1] Compiling Main             ( test.hs, test.o )
Linking test ...
warming up
estimating clock resolution...
mean is 1.920340 us (320001 iterations)
found 51108 outliers among 319999 samples (16.0%)
  46741 (14.6%) low severe
  4367 (1.4%) high severe
estimating cost of a clock call...
mean is 83.20445 ns (16 iterations)
found 4 outliers among 16 samples (25.0%)
  2 (12.5%) low mild
  2 (12.5%) high severe

benchmarking even
mean: 1.508542 ms, lb 1.503661 ms, ub 1.514950 ms, ci 0.950
std dev: 28.53574 us, lb 23.35796 us, ub 35.19769 us, ci 0.950
found 18 outliers among 100 samples (18.0%)
  17 (17.0%) high severe
variance introduced by outliers: 11.371%
variance is moderately inflated by outliers

benchmarking evenRem
mean: 1.937735 ms, lb 1.930118 ms, ub 1.949699 ms, ci 0.950
std dev: 48.17240 us, lb 34.95334 us, ub 71.22055 us, ci 0.950
found 14 outliers among 100 samples (14.0%)
  3 (3.0%) high mild
  11 (11.0%) high severe
variance introduced by outliers: 18.989%
variance is moderately inflated by outliers

benchmarking evenBits
mean: 996.3537 us, lb 992.2839 us, ub 1.003864 ms, ci 0.950
std dev: 27.37875 us, lb 17.51923 us, ub 43.98499 us, ci 0.950
found 15 outliers among 100 samples (15.0%)
  2 (2.0%) high mild
  13 (13.0%) high severe
variance introduced by outliers: 21.905%
variance is moderately inflated by outliers

benchmarking remRem
mean: 1.925495 ms, lb 1.918590 ms, ub 1.935869 ms, ci 0.950
std dev: 43.00092 us, lb 31.67173 us, ub 57.83841 us, ci 0.950
found 13 outliers among 100 samples (13.0%)
  13 (13.0%) high severe
variance introduced by outliers: 15.198%
variance is moderately inflated by outliers

Conclusion: the extra rem costs a bit more (evenRem and remRem both come in slower than plain even), and going through .&. costs a bit less (evenBits is the fastest of the four).
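
That tracks with how even is defined: in GHC's base (GHC.Real), even n is, last I checked, just n `rem` 2 == 0, so it still goes through a remainder computation on the full Integer, while the .&. version only needs the low bit. If all you need is parity, Data.Bits also offers testBit; a small sketch (the helper names are mine, not part of the benchmark above):

import Data.Bits (testBit, (.&.))

-- both of these look only at the least significant bit
-- instead of computing a remainder
evenViaAnd, evenViaTestBit :: Integer -> Bool
evenViaAnd     x = x .&. 1 == 0
evenViaTestBit x = not (testBit x 0)

main :: IO ()
main = print (map evenViaTestBit [2 ^ 128, 2 ^ 128 + 1])  -- [True,False]

Whether testBit is actually any cheaper than .&. here would need its own criterion run; the point is only that both avoid the rem.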

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow