The simplest primality "test" is as follows: Given an input number N, we check every integer k > 1 other than N to see whether N is divisible by k. If N is divisible by any such k, then N is not prime.
Clearly, if N is prime then this procedure runs forever, so it is not a true test.
The simplest true primality test is as follows: Given an input number N, we check whether it is divisible by any integer strictly between 1 and N. If the answer is NO, then N is prime; otherwise it is not. This relies on the trivial fact that any factor of N other than 1 and N itself must fall in this range.
A more efficient primality test is as follows: Given an input number N, we check whether it is divisible by any integer greater than 1 and less than or equal to the square root of N. If the answer is NO, then N is prime; otherwise it is not. This relies on the not-so-trivial fact that if N has any factors other than 1 and N, at least one of them must be less than or equal to this square root: if N = a·b with both a and b greater than the square root of N, then a·b would exceed N.
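To illustrate, a minimal trial-division routine along these lines might look as follows (a sketch in Python; the function name is merely illustrative):

    def is_prime_trial_division(N: int) -> bool:
        """Trial division: test divisors up to the square root of N."""
        if N < 2:
            return False
        k = 2
        while k * k <= N:      # only divisors k with k*k <= N need to be checked
            if N % k == 0:     # found a nontrivial factor, so N is composite
                return False
            k += 1
        return True            # no divisor found: N is prime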
A good way to speed up this method (and all the others mentioned below) is to pre-compute and store a list of all primes up to a certain bound, say all primes up to 200; such a list can be computed with the Sieve of Eratosthenes. Then, before testing N for primality with a serious method, one first checks whether N is divisible by any prime from the list.
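As an illustration, the following sketch (again in Python; the bound of 200 follows the text, and the names are illustrative) builds such a list with the Sieve of Eratosthenes and uses it as a quick pre-check:

    def sieve_of_eratosthenes(bound: int) -> list:
        """Return a list of all primes up to and including bound."""
        flags = [True] * (bound + 1)
        flags[0] = flags[1] = False
        for p in range(2, int(bound ** 0.5) + 1):
            if flags[p]:
                for multiple in range(p * p, bound + 1, p):   # cross out multiples of p
                    flags[multiple] = False
        return [p for p in range(bound + 1) if flags[p]]

    SMALL_PRIMES = sieve_of_eratosthenes(200)

    def small_prime_precheck(N: int) -> str:
        """Cheap pre-check before handing N to a more serious primality test."""
        for p in SMALL_PRIMES:
            if N == p:
                return "prime"
            if N % p == 0:
                return "composite"
        return "unknown"       # survived the filter; run a real test on N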
Most popular primality tests are probabilistic tests. In a certain sense, these tests are not really primality tests, since they do not determine with certainty whether a number is prime. The basic idea is as follows:
1. Randomly pick a number a, called a witness.
2. Check some condition involving a and the given number N. If the condition fails, then N is certainly composite and the test stops.
3. Otherwise, repeat from step 1 with a new random witness.
After several iterations, if N is not found to be a composite number, then it can be declared probably prime.
These tests are fast but not perfect. A composite number that a given test declares "probably prime" for some witness is called a pseudoprime for that test, and for certain tests there are even composites that pass for almost every choice of witness.
The simplest probabilistic primality test is the Fermat primality test. It is sometimes used when a rapid screening of numbers is needed, for instance in the key generation phase of the RSA public-key cryptographic algorithm. The Miller-Rabin test is a more sophisticated variant with a lower error rate; it is often the method of choice.
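By way of illustration, minimal Python sketches of both tests could look as follows; the witnesses are drawn with the standard random module, and the default of 20 rounds is an arbitrary choice for the example:

    import random

    def fermat_test(N: int, rounds: int = 20) -> bool:
        """Fermat test: a prime N satisfies a**(N-1) % N == 1 for every a not divisible by N."""
        if N < 4:
            return N in (2, 3)
        for _ in range(rounds):
            a = random.randrange(2, N - 1)     # random witness
            if pow(a, N - 1, N) != 1:          # the congruence fails: N is certainly composite
                return False
        return True                            # N is probably prime

    def miller_rabin_test(N: int, rounds: int = 20) -> bool:
        """Miller-Rabin test, based on writing N - 1 = 2**s * d with d odd."""
        if N < 4:
            return N in (2, 3)
        if N % 2 == 0:
            return False
        s, d = 0, N - 1
        while d % 2 == 0:
            s, d = s + 1, d // 2
        for _ in range(rounds):
            a = random.randrange(2, N - 1)     # random witness
            x = pow(a, d, N)
            if x in (1, N - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, N)
                if x == N - 1:
                    break
            else:
                return False                   # a proves that N is composite
        return True                            # N is probably prime

For the Miller-Rabin test, at most one quarter of the possible witnesses fail to detect a given odd composite N, which is the source of its lower error rate.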
A deterministic primality test is the cyclotomy test; its runtime can be proven to be O(n^(c·log(log(n)))), where n is the number of digits of N and c is a constant independent of n. This is slower than polynomial, but not by much. In fact, for numbers N with up to 1000 digits or so, this is the fastest known method.
The elliptic curve primality test can be proven to run in O(n^6), but only if some still unproven (but widely assumed to be true) statements of analytic number theory are used. In practice, this test is slower than the cyclotomy test for numbers with up to 10,000 digits or so.
The implementation of these two methods is rather difficult, and their error probabilities in practice may therefore be even higher than those of the probabilistic methods mentioned above.
If the generalized Riemann hypothesis is assumed, the Miller-Rabin test can be turned into a deterministic version with runtime O(n^4). In practice, this algorithm is slower than the other two for the sizes of numbers that can be dealt with at all.
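Under that hypothesis it is enough to check every witness a up to roughly 2·(ln N)^2 (a classical bound due to Bach), so a self-contained Python sketch of the deterministic variant might look as follows (names and structure are illustrative only):

    import math

    def passes_miller_rabin_base(N: int, a: int) -> bool:
        """Miller-Rabin check for a single base a (N odd, N > 3)."""
        s, d = 0, N - 1
        while d % 2 == 0:
            s, d = s + 1, d // 2
        x = pow(a, d, N)
        if x in (1, N - 1):
            return True
        for _ in range(s - 1):
            x = pow(x, 2, N)
            if x == N - 1:
                return True
        return False

    def deterministic_miller_rabin(N: int) -> bool:
        """Deterministic Miller-Rabin, correct if the generalized Riemann hypothesis holds."""
        if N < 4:
            return N in (2, 3)
        if N % 2 == 0:
            return False
        bound = min(N - 2, int(2 * math.log(N) ** 2))   # check all bases up to 2*(ln N)^2
        for a in range(2, bound + 1):
            if not passes_miller_rabin_base(N, a):
                return False                            # a proves that N is composite
        return True                                     # N is prime (assuming GRH)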
In 2002, Manindra Agrawal, Nitin Saxena and Neeraj Kayal described a new deterministic primality test which runs in O(n^12), and this bound can be rigorously proven. It was the first deterministic primality test with provably polynomial run-time. In practice, this algorithm is slower than the other methods.