In formulas:

    w(T) = \min_{T'} \sum_{(u,v) \in T'} w(u,v)

where w(T) is the minimum total weight, the minimum is taken over all spanning trees T', and w(u,v) is the weight of the edge (u,v) between vertices u and v.
One example would be a cable TV company laying cable to a new neighborhood. If it is constrained to bury the cable only along certain paths, then there would be a graph representing which points are connected by those paths. Some of those paths might be more expensive because they are longer or require the cable to be buried deeper. A spanning tree for that graph would be a subset of those paths that has no cycles but still connects to every house. Several spanning trees might be possible. A minimum spanning tree would be one with the lowest total cost; if there is a tie, there could be several minimum spanning trees.
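To make the definition concrete, here is a minimal brute-force sketch in Python that applies w(T) = Σ w(u,v) directly: it enumerates every spanning tree of a small made-up graph (four "houses" and five possible cable paths with invented costs) and keeps the cheapest one. This is only practical for tiny graphs, but it mirrors the definition exactly.

```python
from itertools import combinations

def is_spanning_tree(num_vertices, edge_subset):
    """A set of n-1 edges is a spanning tree iff it connects all n vertices."""
    if len(edge_subset) != num_vertices - 1:
        return False
    adjacency = {v: [] for v in range(num_vertices)}
    for u, v, _ in edge_subset:
        adjacency[u].append(v)
        adjacency[v].append(u)
    seen = {0}
    frontier = [0]
    while frontier:                      # simple graph search from vertex 0
        u = frontier.pop()
        for v in adjacency[u]:
            if v not in seen:
                seen.add(v)
                frontier.append(v)
    return len(seen) == num_vertices

# Edges are (u, v, weight); the graph and weights are invented for illustration.
edges = [(0, 1, 1), (0, 2, 4), (1, 2, 3), (1, 3, 2), (2, 3, 5)]
num_vertices = 4

spanning_trees = [t for t in combinations(edges, num_vertices - 1)
                  if is_spanning_tree(num_vertices, t)]
best = min(spanning_trees, key=lambda t: sum(w for _, _, w in t))
print(list(best), sum(w for _, _, w in best))   # the cheapest spanning tree has w(T) = 6
```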
The first algorithm for finding a minimum spanning tree was developed by the Czech scientist Otakar Borůvka in 1926. Its purpose was to provide efficient electrical coverage of Bohemia. Two algorithms are commonly used today: Prim's algorithm and Kruskal's algorithm. Both are greedy algorithms, and both run in polynomial time, so the problem of finding such trees is in P.
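As an illustration of the greedy approach, here is a minimal sketch of Kruskal's algorithm in Python, using a simple union-find structure to detect cycles. It is a sketch rather than a reference implementation, and it reuses the small invented graph from the previous example.

```python
def kruskal(num_vertices, edges):
    """Return a minimum spanning tree as a list of (u, v, weight) edges."""
    parent = list(range(num_vertices))

    def find(x):
        # Find the representative of x's component, compressing the path as we go.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for u, v, weight in sorted(edges, key=lambda e: e[2]):  # cheapest edge first
        root_u, root_v = find(u), find(v)
        if root_u != root_v:            # adding this edge does not create a cycle
            parent[root_u] = root_v
            tree.append((u, v, weight))
    return tree

edges = [(0, 1, 1), (0, 2, 4), (1, 2, 3), (1, 3, 2), (2, 3, 5)]
mst = kruskal(4, edges)
print(mst)                          # [(0, 1, 1), (1, 3, 2), (1, 2, 3)]
print(sum(w for _, _, w in mst))    # 6, the same w(T) as the brute-force sketch above
```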
The fastest minimum spanning tree algorithm to date was developed by Chazelle and is based on Borůvka's. Its running time is O(m α(m,n)), where m is the number of edges, n is the number of vertices, and α is the classical functional inverse of the Ackermann function.
What is the fastest possible algorithm for this problem? That is one of the oldest open questions in computer science. If the edge weights are integers with bounded bit length, then deterministic algorithms with linear running time, O(m), are known. For general weights, randomized algorithms are known that run in linear expected time. Whether there exists a deterministic linear-time algorithm for general weights is still an open question.