Many real-world problems have exponential time or space complexity, so only very small instances are amenable to this approach. The traveling salesman problem is the classic example: the number of candidate tours grows factorially with the number of cities.
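As a concrete illustration (not part of the original text), here is a minimal sketch of a brute-force traveling salesman solver; the 4-city distance matrix is made up for the example.

```python
from itertools import permutations

def brute_force_tsp(dist):
    """Try every tour that starts at city 0 and return the cheapest.

    dist is an n-by-n matrix of pairwise distances; the number of
    candidate tours grows as (n-1)!, which is why this only works
    for very small n.
    """
    n = len(dist)
    best_tour, best_cost = None, float("inf")
    for rest in permutations(range(1, n)):          # enumerate all tours
        tour = (0,) + rest
        cost = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
        if cost < best_cost:
            best_tour, best_cost = tour, cost
    return best_tour, best_cost

# Hypothetical 4-city distance matrix, for illustration only.
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
print(brute_force_tsp(dist))   # ((0, 1, 3, 2), 18)
```

With only a handful more cities the permutation count explodes, which is exactly the limitation the surrounding text describes.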
The field of computational complexity theory was developed to study these limits.
Speeding up brute-force searches
In many cases a problem can be transformed to reduce the search space. In the proof of the four color theorem, for example, a potentially infinite search space was first reduced to a finite set of cases by careful mathematical analysis; further theoretical work shrank that set, and only then was the problem finished off by a brute-force search of the remaining possibilities.
Heuristics can also be used to cut off parts of the search early. A classic example is alpha-beta pruning in minimax search of game trees: it eliminates many subtrees at an early stage of the search, effectively doubling the depth that can be searched for a given amount of computation, as sketched below.
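A minimal sketch of minimax search with alpha-beta cutoffs; the game tree of leaf scores at the bottom is invented purely for illustration.

```python
def alphabeta(node, maximizing, alpha=float("-inf"), beta=float("inf")):
    """Minimax with alpha-beta cutoffs.

    A node is either a numeric leaf score or a list of child nodes.
    Whole subtrees are skipped as soon as alpha >= beta.
    """
    if not isinstance(node, list):          # leaf: static score
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, False, alpha, beta))
            alpha = max(alpha, value)
            if alpha >= beta:               # cutoff: remaining children ignored
                break
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, True, alpha, beta))
            beta = min(beta, value)
            if alpha >= beta:               # cutoff
                break
        return value

# Hypothetical two-ply game tree: the maximizer picks a subtree, the
# minimizer then picks a leaf within it.
tree = [[3, 5], [2, 9], [0, 7]]
print(alphabeta(tree, True))   # 3
```

In the second and third subtrees the remaining leaves are never examined, which is the early cutoff the paragraph describes.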
In certain fields, such as language parsing, techniques such as chart parsing can exploit constraints in the problem to reduce an exponential-complexity problem to one of polynomial complexity.
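For instance, a CYK-style recognizer (one form of chart parsing) checks membership in a context-free language in O(n^3) time rather than exploring parse trees exhaustively. The toy grammar below is invented for illustration and is assumed to be in Chomsky normal form.

```python
def cyk_recognize(words, grammar, start="S"):
    """CYK chart parsing for a grammar in Chomsky normal form.

    grammar maps a right-hand side (a terminal string, or a tuple of two
    nonterminals) to the set of nonterminals that produce it.  The chart
    cell chart[i][j] holds every nonterminal deriving words[i : i+j+1].
    """
    n = len(words)
    chart = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):                       # length-1 spans
        chart[i][0] = set(grammar.get(w, ()))
    for length in range(2, n + 1):                      # longer spans
        for i in range(n - length + 1):
            for split in range(1, length):
                for left in chart[i][split - 1]:
                    for right in chart[i + split][length - split - 1]:
                        chart[i][length - 1] |= set(grammar.get((left, right), ()))
    return start in chart[0][n - 1]

# Hypothetical CNF grammar: S -> NP VP, VP -> V NP, NP -> "she" | "fish", V -> "eats"
grammar = {
    ("NP", "VP"): {"S"},
    ("V", "NP"): {"VP"},
    "she": {"NP"},
    "fish": {"NP"},
    "eats": {"V"},
}
print(cyk_recognize(["she", "eats", "fish"], grammar))   # True
```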
In the anagram problem, for example, a naive search would test every permutation of a word's letters, a factorial number of candidates, whereas sorting the letters of each word reduces anagram detection to a simple comparison.
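A sketch of that transformation, using a small hypothetical word list: each word is reduced to its sorted letters, so looking up anagrams becomes a dictionary lookup instead of a search over permutations.

```python
from collections import defaultdict

def build_anagram_index(words):
    """Group words by their sorted letters, so anagram lookup is O(1)."""
    index = defaultdict(list)
    for word in words:
        index["".join(sorted(word))].append(word)
    return index

def anagrams_of(query, index):
    """All dictionary words that are anagrams of the query."""
    return [w for w in index["".join(sorted(query))] if w != query]

# Hypothetical word list for illustration.
words = ["listen", "silent", "enlist", "google", "banana"]
index = build_anagram_index(words)
print(anagrams_of("listen", index))   # ['silent', 'enlist']
```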
The search space can also be reduced by replacing the full problem with a simplified version. In computer chess, for example, rather than computing the full minimax tree of all possible moves for the remainder of the game, a more limited tree is computed, cut off after a certain number of moves, with the positions at the cutoff approximated by a static evaluation function.
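A sketch of such a depth-limited search follows; the game interface (the legal_moves, apply_move, and evaluate callbacks) and the toy subtraction game used to exercise it are assumptions made for the example, standing in for a real chess engine.

```python
def minimax_depth_limited(state, depth, maximizing, legal_moves, apply_move, evaluate):
    """Depth-limited minimax: the game tree is cut off after `depth` plies,
    and frontier positions are scored by a static evaluation function
    instead of being searched to the end of the game."""
    moves = legal_moves(state)
    if depth == 0 or not moves:            # frontier or terminal position
        return evaluate(state)
    if maximizing:
        return max(minimax_depth_limited(apply_move(state, m), depth - 1, False,
                                         legal_moves, apply_move, evaluate)
                   for m in moves)
    return min(minimax_depth_limited(apply_move(state, m), depth - 1, True,
                                     legal_moves, apply_move, evaluate)
               for m in moves)

# Toy usage on a made-up subtraction game: a state is a pile of counters,
# a move removes 1 or 2, and the static evaluation just scores the pile size.
legal = lambda pile: [1, 2] if pile > 0 else []
apply_m = lambda pile, take: pile - take
evaluate = lambda pile: -pile            # crude heuristic, for illustration only
print(minimax_depth_limited(10, 3, True, legal, apply_m, evaluate))   # -5
```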
Brute force attacks in cryptography
Brute-force search is also important in cryptography, where a well-designed cipher should be breakable only by a brute-force search of its key space.
A brute-force attack against a cipher consists of trying every possible key. Statistically, if the keys were chosen at random, the plaintext will be recovered after about half of the possible keys have been tried. The underlying assumption is, of course, that the cipher itself is known. Since Auguste Kerckhoffs first published the principle, a fundamental maxim of cryptography has been that security must reside only in the key; as Claude E. Shannon put it a few decades later, 'the enemy knows the system'. In practice, this has been excellent advice.
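A toy illustration of exhaustive key search against a deliberately weak cipher (a made-up 16-bit XOR scheme, not any real algorithm), assuming the attacker can recognize a fragment of the correct plaintext:

```python
def toy_encrypt(plaintext: bytes, key: int) -> bytes:
    """A deliberately weak cipher for illustration: XOR every byte with the
    alternating bytes of a 16-bit key.  Real ciphers are attacked the same
    way in principle, just over an astronomically larger key space."""
    key_bytes = key.to_bytes(2, "big")
    return bytes(b ^ key_bytes[i % 2] for i, b in enumerate(plaintext))

def brute_force_key(ciphertext: bytes, known_plaintext: bytes):
    """Try every 16-bit key; on average half the key space is searched
    before the right key is found."""
    for key in range(2 ** 16):
        # XOR is its own inverse, so applying toy_encrypt to the ciphertext
        # with the correct key recovers the plaintext.
        if toy_encrypt(ciphertext, key).startswith(known_plaintext):
            return key
    return None

secret_key = 0xBEEF
ciphertext = toy_encrypt(b"ATTACK AT DAWN", secret_key)
print(hex(brute_force_key(ciphertext, b"ATTACK")))   # 0xbeef
```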
As of 2002, symmetric ciphers with keys of 64 bits or less are considered vulnerable to brute-force attacks. DES, a well-respected symmetric algorithm that uses 56-bit keys, was broken by a project of the EFF (a non-profit cyberspace civil rights group) in the late 1990s; the group even published a book about the exploit, Cracking DES (O'Reilly and Associates). Many people believe that well-funded organisations such as the NSA can mount a successful brute-force attack against a symmetric-key cipher with a 64-bit key; given that it has been done publicly, this is surely true. Many observers therefore suggest a minimum key length of 128 bits for symmetric-key algorithms.
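The key-length advice above comes down to simple arithmetic. The sketch below uses a made-up trial rate of 10^12 keys per second purely to show the scale of the gap between 56-, 64-, and 128-bit key spaces.

```python
# Rough feasibility arithmetic; the attacker's rate is an assumption.
rate = 1e12                      # keys tested per second (hypothetical)
seconds_per_year = 3600 * 24 * 365
for bits in (56, 64, 128):
    years = 2 ** bits / rate / seconds_per_year
    print(f"{bits}-bit key space: ~{years:.2e} years to exhaust")
```

At that assumed rate a 56-bit key space falls in under a day and a 64-bit space within about a year, while a 128-bit space would take on the order of 10^19 years, which is why 128 bits is treated as a comfortable margin.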
The situation for asymmetric algorithms is more complicated and depends on the individual algorithm. The currently breakable key length for the RSA algorithm is at least 512 bits (this has been done publicly), but for most elliptic-curve asymmetric algorithms the largest currently breakable key length is believed to be rather shorter, perhaps as little as 128 bits or so. A message encrypted with a 109-bit key by an elliptic-curve encryption algorithm was publicly broken by brute-force key search in early 2003. At this writing, key lengths of 128 bits seem reasonable for elliptic-curve algorithms, and 1024 bits for other asymmetric-key algorithms such as RSA.