Question

I've noticed that elementary arithmetic operations on Java's BigInteger class seem far less efficient than their primitive counterparts, even when the same numbers and operations are involved. An algorithm using a BigInteger representation of a number takes vastly longer to run than the exact same algorithm using a long representation of the same number.

To illustrate what I mean, I've provided a working code example. In the example below, I simply iterate through all the integers between 1 and 100,000,000, performing a mod 2 operation on each iteration, then print out the total running time of the loop. I first do this using a long, then using a BigInteger:

import java.math.BigInteger;

public class FunWithNumbers {

    public static void main(String[] args) {
        
        long myNumL = 100000000L; // Long representation of some number n
        BigInteger myNumB = new BigInteger("100000000"); // BI representation of same number
        
        /* long version */
        long startTime = System.nanoTime();
        for (long i=1L; i<= myNumL; i++) {
            long a = myNumL % 2;
        }
        System.out.println("Total computation time (long representation): "
                + (System.nanoTime() - startTime) * Math.pow(10, -9) + " seconds.");
        
        
        /* BigInteger version */
        startTime = System.nanoTime();
        BigInteger two = BigInteger.valueOf(2);
        BigInteger index = BigInteger.ONE;
        while (index.compareTo(myNumB) <= 0) {
            BigInteger b = myNumB.remainder(two);
            index = index.add(BigInteger.ONE);
        }
        System.out.println("Total computation time (BI representation): "
                + (System.nanoTime() - startTime) * Math.pow(10, -9) + " seconds.");
    }
}

This generates the following output:

Total computation time (long representation): 0.035671096 seconds.

Total computation time (BI representation): 7.031978092 seconds.

As you can see, the running times aren't even remotely comparable. My problem is that I need to work with numbers that are too large to fit in a long.

Is there a way to get back the efficiency of primitive arithmetic and still be able to work with arbitrarily large numbers that exceed the max size of a long?

I'm fine with switching to another language if that's what it takes; I'm by no means tied down to Java.

Solution

I suggest you try C or C++. There are various C/C++ libraries for doing large-integer arithmetic efficiently; for instance, I came across one called "ttmath" that looks promising.

A large part of BigInteger's efficiency problem is that it models numbers as immutable objects, so every arithmetic operation allocates a brand-new BigInteger for its result.
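A minimal sketch illustrating the point: calling an arithmetic method never modifies the receiver, it returns a freshly allocated object, so a tight loop of BigInteger operations churns through allocations that a long-based loop avoids entirely.

```java
import java.math.BigInteger;

public class ImmutableCost {
    public static void main(String[] args) {
        BigInteger a = BigInteger.valueOf(12345);
        BigInteger b = a.add(BigInteger.ONE); // allocates a new BigInteger for the result

        System.out.println(a == b);  // false: 'add' returned a distinct object
        System.out.println(a);       // 12345 -- the operand is unchanged
        System.out.println(b);       // 12346
    }
}
```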

OTHER TIPS

Is there a way to get back the efficiency of primitive arithmetic and still be able to work with arbitrarily large numbers that exceed the max size of a long?

No, not in Java.

If you do not need exact answers, consider double. It has a wide range and very fast arithmetic, at the expense of only representing 53 significant bits.
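To see the 53-bit limit concretely, here is a small sketch: 2^53 is the largest power of two at which every integer is still exactly representable as a double, so adding 1 to it is silently rounded away, while adding 2 lands on the next representable value.

```java
public class DoublePrecision {
    public static void main(String[] args) {
        double big = 9007199254740992.0;      // 2^53: the edge of exact integer precision

        System.out.println(big == big + 1.0); // true: 2^53 + 1 is not representable, rounds back
        System.out.println(big == big + 2.0); // false: 2^53 + 2 is representable
    }
}
```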

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow