Question

I am actually very new to C, but for a project, I'd like to be able to calculate the value of Pi from 1 million to at least 32 million decimal places. Basically, like what SuperPi/HyperPi does for benchmarking a CPU.

But obviously, the standard C library is incapable of this.

What library can I use, and what algorithm do I use for this task?

And precision matters too; anyone can cook up some rand() bloat and call it the "Ultimate value of Pi".

My compiler is GCC, so if possible, I'd like the library to compile with it (I already have the BigNum library).


Solution

I've used the quadratic algorithm from there with success. I'd suggest MPFR for the library part.
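For concreteness, here is a minimal sketch of what a quadratically convergent iteration looks like with MPFR. This is the Gauss-Legendre (Brent-Salamin) iteration, which may or may not be the exact variant the answer has in mind, and the digit target and iteration count are illustrative choices, not values from the original answer.

/* Sketch only: Gauss-Legendre iteration for pi with MPFR.
 * The digit count and iteration count below are illustrative assumptions.
 * Build with: gcc pi.c -o pi -lmpfr -lgmp
 */
#include <stdio.h>
#include <mpfr.h>

int main(void)
{
    const unsigned long digits = 10000;                          /* decimal digits wanted */
    const mpfr_prec_t prec = (mpfr_prec_t)(digits * 3.33) + 64;  /* working precision in bits, with slack */

    mpfr_t a, b, t, p, a_next, tmp, pi;
    mpfr_inits2(prec, a, b, t, p, a_next, tmp, pi, (mpfr_ptr) 0);

    /* a0 = 1, b0 = 1/sqrt(2), t0 = 1/4, p0 = 1 */
    mpfr_set_ui(a, 1, MPFR_RNDN);
    mpfr_sqrt_ui(b, 2, MPFR_RNDN);
    mpfr_ui_div(b, 1, b, MPFR_RNDN);
    mpfr_set_d(t, 0.25, MPFR_RNDN);
    mpfr_set_ui(p, 1, MPFR_RNDN);

    /* Each iteration roughly doubles the number of correct digits. */
    for (int i = 0; i < 25; i++) {
        mpfr_add(a_next, a, b, MPFR_RNDN);
        mpfr_div_ui(a_next, a_next, 2, MPFR_RNDN);   /* a' = (a+b)/2        */

        mpfr_mul(b, a, b, MPFR_RNDN);
        mpfr_sqrt(b, b, MPFR_RNDN);                  /* b' = sqrt(a*b)      */

        mpfr_sub(tmp, a, a_next, MPFR_RNDN);
        mpfr_sqr(tmp, tmp, MPFR_RNDN);
        mpfr_mul(tmp, tmp, p, MPFR_RNDN);
        mpfr_sub(t, t, tmp, MPFR_RNDN);              /* t' = t - p*(a-a')^2 */

        mpfr_mul_ui(p, p, 2, MPFR_RNDN);             /* p' = 2p             */
        mpfr_set(a, a_next, MPFR_RNDN);
    }

    /* pi ~= (a+b)^2 / (4t) */
    mpfr_add(pi, a, b, MPFR_RNDN);
    mpfr_sqr(pi, pi, MPFR_RNDN);
    mpfr_mul_ui(t, t, 4, MPFR_RNDN);
    mpfr_div(pi, pi, t, MPFR_RNDN);

    mpfr_printf("%.100Rf\n", pi);   /* print the first 100 digits as a sanity check */

    mpfr_clears(a, b, t, p, a_next, tmp, pi, (mpfr_ptr) 0);
    return 0;
}

For millions of digits you would raise the digit target (and therefore the precision) accordingly; the number of iterations only grows logarithmically because the error roughly squares each pass.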

Other tips

As for the algorithm, see http://en.wikipedia.org/wiki/Chudnovsky_algorithm. For a library to deal with bignums, check http://en.wikipedia.org/wiki/Arbitrary-precision_arithmetic#Libraries. Have fun.
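To give a feel for the Chudnovsky series that link describes, below is a rough term-by-term summation with MPFR. This is only an illustration under assumed parameters (the digit target and term count are arbitrary choices); serious million-digit programs evaluate this series with binary splitting over GMP integers rather than a naive floating-point loop like this.

/* Sketch only: naive Chudnovsky summation with MPFR.
 * Recomputes the factorials every term, so it is far slower than the
 * binary-splitting form real pi programs use; it just shows the series.
 * Build with: gcc chudnovsky.c -o chudnovsky -lmpfr -lgmp
 */
#include <stdio.h>
#include <mpfr.h>

int main(void)
{
    const unsigned long digits = 1000;             /* illustrative target        */
    const mpfr_prec_t prec = (mpfr_prec_t)(digits * 3.33) + 64;
    const unsigned long terms = digits / 14 + 2;   /* ~14.18 digits per term     */

    mpfr_t sum, term, num, den, tmp, pi;
    mpfr_inits2(prec, sum, term, num, den, tmp, pi, (mpfr_ptr) 0);
    mpfr_set_ui(sum, 0, MPFR_RNDN);

    for (unsigned long k = 0; k < terms; k++) {
        /* numerator: (-1)^k (6k)! (13591409 + 545140134 k) */
        mpfr_fac_ui(num, 6 * k, MPFR_RNDN);
        mpfr_set_ui(tmp, 545140134UL, MPFR_RNDN);
        mpfr_mul_ui(tmp, tmp, k, MPFR_RNDN);
        mpfr_add_ui(tmp, tmp, 13591409UL, MPFR_RNDN);
        mpfr_mul(num, num, tmp, MPFR_RNDN);
        if (k % 2 == 1)
            mpfr_neg(num, num, MPFR_RNDN);

        /* denominator: (3k)! (k!)^3 640320^(3k) */
        mpfr_fac_ui(den, 3 * k, MPFR_RNDN);
        mpfr_fac_ui(tmp, k, MPFR_RNDN);
        mpfr_pow_ui(tmp, tmp, 3, MPFR_RNDN);
        mpfr_mul(den, den, tmp, MPFR_RNDN);
        mpfr_ui_pow_ui(tmp, 640320UL, 3 * k, MPFR_RNDN);
        mpfr_mul(den, den, tmp, MPFR_RNDN);

        mpfr_div(term, num, den, MPFR_RNDN);
        mpfr_add(sum, sum, term, MPFR_RNDN);
    }

    /* pi = 640320^(3/2) / (12 * sum) */
    mpfr_set_ui(tmp, 640320UL, MPFR_RNDN);
    mpfr_pow_ui(tmp, tmp, 3, MPFR_RNDN);
    mpfr_sqrt(tmp, tmp, MPFR_RNDN);
    mpfr_mul_ui(sum, sum, 12, MPFR_RNDN);
    mpfr_div(pi, tmp, sum, MPFR_RNDN);

    mpfr_printf("%.100Rf\n", pi);

    mpfr_clears(sum, term, num, den, tmp, pi, (mpfr_ptr) 0);
    return 0;
}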

License: CC-BY-SA with attribution