Question

I am looking for an algorithm that would make it possible to prove the primality of any large number. By large, I mean numbers with at least 100,000,000 decimal digits that cannot be expressed with simple formulas like Mersenne primes, etc.

Here are my requirements:

1- It must be completely correct.

2- It must be runnable on a basic home computer.

3- It must complete its course within a few weeks or months.

My memory limitation is 8 GB of RAM (I can set options for how much cache is available) on a dedicated machine with a 1 TB hard drive. I will be considering numbers one at a time over the course of several months.

Edit1: I am well aware that this is a difficult arena to compete in, if not a nearly impossible one using current methods. I'm not using current methods, and I need a way of proving my methods correct for very large numbers.

Edit2: Part of the reason I need a non-probabilistic method is that this will be an attempt at an EFF award and, if that succeeds, at a second EFF award. If my methods are correct (and that's one honking IF), I should be able to do all of that with my notebook PC.


Solution

From what I understand, you are looking for an algorithm that is:

  • deterministic
  • fairly fast
  • with a fairly reasonable memory footprint

I'm not sure about the last part, since I have never tried to implement it (yet), but I do know that the primality problem has been solved for general numbers. In other words, you can determine whether a number is prime or not, with 100% certainty, regardless of its form. You should take a look at the AKS primality test, which is the first known deterministic, polynomial-time algorithm that decides primality for arbitrary numbers.
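To make the structure of the test concrete, here is a minimal, deliberately naive Python sketch of AKS following the textbook/Wikipedia presentation. All of the function names are mine, the implementation favors clarity over speed, and it is only practical for tiny inputs; it is nowhere near what a 100,000,000-digit run would require.

    from math import gcd, isqrt

    def integer_nth_root(n, k):
        """Largest integer a with a**k <= n (binary search, exact arithmetic)."""
        lo, hi = 1, 1 << (n.bit_length() // k + 1)
        while lo < hi:
            mid = (lo + hi + 1) // 2
            if mid ** k <= n:
                lo = mid
            else:
                hi = mid - 1
        return lo

    def is_perfect_power(n):
        """True if n = a**b for some integers a >= 2, b >= 2."""
        for b in range(2, n.bit_length() + 1):
            a = integer_nth_root(n, b)
            if a >= 2 and a ** b == n:
                return True
        return False

    def poly_mul_mod(p, q, r, n):
        """Schoolbook product of two polynomials modulo (x**r - 1, n)."""
        res = [0] * r
        for i, pi in enumerate(p):
            if pi:
                for j, qj in enumerate(q):
                    if qj:
                        res[(i + j) % r] = (res[(i + j) % r] + pi * qj) % n
        return res

    def poly_pow_mod(p, e, r, n):
        """p**e modulo (x**r - 1, n) by square-and-multiply."""
        result = [1] + [0] * (r - 1)
        while e:
            if e & 1:
                result = poly_mul_mod(result, p, r, n)
            p = poly_mul_mod(p, p, r, n)
            e >>= 1
        return result

    def aks_is_prime(n):
        if n < 2:
            return False
        if is_perfect_power(n):                      # step 1: rule out a**b
            return False
        log2n = n.bit_length()                       # upper bound on log2(n)
        limit = log2n * log2n
        r = 2
        while True:                                  # step 2: find a suitable r
            if gcd(r, n) == 1:
                k, t = 1, n % r                      # multiplicative order of n mod r
                while k <= limit and t != 1:
                    t = (t * (n % r)) % r
                    k += 1
                if k > limit:                        # order exceeds (log2 n)^2
                    break
            r += 1
        for a in range(2, min(r, n)):                # step 3: small nontrivial gcd
            if 1 < gcd(a, n) < n:
                return False
        if n <= r:                                   # step 4
            return True
        phi_r = sum(1 for a in range(1, r) if gcd(a, r) == 1)
        a_max = isqrt(phi_r * log2n * log2n)         # ~ sqrt(phi(r)) * log2(n)
        x_to_n = [0] * r
        x_to_n[n % r] = 1                            # x**n reduced mod x**r - 1
        for a in range(1, a_max + 1):                # step 5: (x+a)^n == x^n + a ?
            lhs = poly_pow_mod([a % n, 1] + [0] * (r - 2), n, r, n)
            rhs = x_to_n[:]
            rhs[0] = (rhs[0] + a) % n
            if lhs != rhs:
                return False
        return True                                  # step 6: n is prime

    # Tiny sanity check (this naive version is only usable for tiny inputs):
    print([p for p in range(2, 60) if aks_is_prime(p)])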

Like you said, it may take a while to run, but it will eventually finish with a certain answer. The complexity of the optimized version is O((log n)^7.5) up to logarithmic factors (log n is proportional to the number of digits of n).
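Just to illustrate what that bound says at the scale mentioned in the question, here is a back-of-envelope conversion from decimal digits to bits and into the (log n)^7.5 figure. It ignores all constant factors and the hidden logarithmic terms, so treat it purely as an order-of-magnitude reading of the formula.

    import math

    digits = 100_000_000                    # target size from the question
    log2_n = digits * math.log2(10)         # roughly 3.32 bits per decimal digit
    rough_ops = log2_n ** 7.5               # (log2 n)^7.5, constants ignored
    print(f"log2(n) is roughly {log2_n:.3g} bits")
    print(f"(log2 n)^7.5 is roughly {rough_ops:.3g}")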

However, since this runtime is quite large and you want to test many numbers, you should consider filtering candidates with a faster, non-deterministic algorithm first. In other words, I would run the Miller-Rabin test with a few bases (a = 2, 3, 5, 7, ... - the first ten primes should be enough; even if you want better accuracy, you probably shouldn't go beyond the first 50 primes). The probability that a composite number survives k Miller-Rabin rounds is less than 1/4^k, if I remember correctly. In other words, you can run a small number of rounds (and these rounds are very fast) and be very confident that the number is prime; if any round fails, the number is definitely NOT prime. Once all the Miller-Rabin rounds pass, run the AKS algorithm on the number just to make sure (as you can see on the Miller-Rabin Wikipedia page, the smallest composite that passes the test for the first few prime bases grows very quickly as more bases are added).
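As a concrete illustration of that filtering step, here is a small Python sketch of Miller-Rabin with the first ten primes as bases. The names are mine and the sample candidate at the end is an arbitrary stand-in; the point is only that anything failing the filter is certainly composite, and only survivors would be handed to a proving algorithm such as AKS.

    import random

    def miller_rabin(n, base):
        """One Miller-Rabin round; True means n is probably prime for this base."""
        if n % base == 0:
            return n == base
        d, s = n - 1, 0
        while d % 2 == 0:                  # write n - 1 as d * 2**s with d odd
            d //= 2
            s += 1
        x = pow(base, d, n)                # fast modular exponentiation
        if x == 1 or x == n - 1:
            return True
        for _ in range(s - 1):
            x = (x * x) % n
            if x == n - 1:
                return True
        return False                       # base is a witness: n is composite

    FIRST_TEN_PRIMES = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29)

    def passes_filter(n, bases=FIRST_TEN_PRIMES):
        """Miller-Rabin filter with fixed small prime bases, as described above."""
        if n < 2:
            return False
        return all(miller_rabin(n, b) for b in bases)

    # Usage sketch: only survivors are worth the expense of a primality proof.
    candidate = random.getrandbits(2048) | 1    # a random odd stand-in candidate
    if passes_filter(candidate):
        print("survived the filter - run a deterministic proof next")
    else:
        print("definitely composite - discard")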

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow