kruskal 2
TRANSCRIPT
8/4/2019 Kruskal 2
Kruskal's algorithm
From Wikipedia, the free encyclopedia
Kruskal's algorithm is an algorithm in graph theory that finds a minimum spanning tree for a connected weighted graph. This means it finds a subset of the edges that forms a tree that includes every vertex, where the total weight of all the edges in the tree is minimized. If the graph is not connected, then it finds a minimum spanning forest (a minimum spanning tree for each connected component). Kruskal's algorithm is an example of a greedy algorithm.
This algorithm first appeared in Proceedings of the American Mathematical Society, pp. 48–50, in 1956, and was written by Joseph Kruskal.
Other algorithms for this problem include Prim's algorithm, the Reverse-Delete algorithm, and Borůvka's algorithm.
Contents
1 Description
2 Performance
3 Pseudocode
4 Example
5 Proof of correctness
    5.1 Spanning tree
    5.2 Minimality
6 See also
7 References
8 External links
Description
create a forest F (a set of trees), where each vertex in the graph is a separate tree
create a set S containing all the edges in the graph
while S is nonempty and F is not yet spanning
    remove an edge with minimum weight from S
    if that edge connects two different trees, then add it to the forest, combining two trees into a single tree
    otherwise discard that edge
At the termination of the algorithm, the forest has only one component and forms a minimum spanning tree of the graph.
Performance
Where E is the number of edges in the graph and V is the number of vertices, Kruskal's algorithm can be shown to run in O(E log E) time, or equivalently, O(E log V) time, all with simple data structures. These running times are equivalent because:
E is at most V², and log V² = 2 log V is O(log V).
If we ignore isolated vertices, which will each be their own component of the minimum spanning forest, V ≤ E + 1, so log V is O(log E).
We can achieve this bound as follows: first sort the edges by weight using a comparison sort in O(E log E) time; this allows the step "remove an edge with minimum weight from S" to operate in constant time. Next, we use a disjoint-set data structure (union-find) to keep track of which vertices are in which components. We need to perform O(E) operations, two 'find' operations and possibly one union for each edge. Even a simple disjoint-set data structure such as disjoint-set forests with union by rank can perform O(E) operations in O(E log V) time. Thus the total time is O(E log E) = O(E log V).
Provided that the edges are either already sorted or can be sorted in linear time (for example with counting sort or radix sort), the algorithm can use a more sophisticated disjoint-set data structure to run in O(E α(V)) time, where α is the extremely slowly growing inverse of the single-valued Ackermann function.
Pseudocode
function Kruskal(G = ⟨V, E⟩ : graph; length : E → R⁺) : set of edges
    Define an elementary cluster C(v) ← {v}
    Initialize a priority queue Q to contain all edges in G, using the weights as keys
    Define a forest T ← ∅        // T will ultimately contain the edges of the MST
    // n is the total number of vertices
    while T has fewer than n − 1 edges do
        // edge (u, v) is the minimum-weight edge remaining in Q
        (u, v) ← Q.removeMin()
        // prevent cycles in T: add (u, v) only if T does not already
        // contain a path between u and v
        Let C(v) be the cluster containing v, and let C(u) be the cluster containing u
        if C(v) ≠ C(u) then
            Add edge (v, u) to T
            Merge C(v) and C(u) into one cluster, that is, union C(v) and C(u)
    return tree T
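As a concrete illustration, the pseudocode above can be sketched in Python. The function name and edge representation below are my own choices, not part of the original, and a simple dict-based union-find stands in for the abstract clusters C(v):

```python
import heapq

def kruskal(vertices, edges):
    """Minimum spanning tree by Kruskal's algorithm.

    vertices: iterable of hashable vertex names
    edges:    iterable of (weight, u, v) tuples
    Returns the MST edges as a list of (weight, u, v).
    """
    # Each vertex starts as its own elementary cluster C(v) = {v}.
    parent = {v: v for v in vertices}

    def find(v):
        # Walk up to the cluster representative, halving paths as we go.
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    q = list(edges)
    heapq.heapify(q)                  # priority queue keyed on edge weight
    tree = []
    n = len(parent)
    while q and len(tree) < n - 1:
        w, u, v = heapq.heappop(q)    # remove an edge of minimum weight
        ru, rv = find(u), find(v)
        if ru != rv:                  # the edge joins two different trees
            tree.append((w, u, v))
            parent[ru] = rv           # merge the two clusters
    return tree
```

For example, `kruskal("ABCD", [(1, "A", "B"), (2, "B", "C"), (3, "A", "C"), (4, "C", "D")])` keeps the edges of weight 1, 2, and 4 and discards (3, "A", "C"), which would close a cycle.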
Example
Image Description
http://en.wikipedia.org/wiki/Comparison_sorthttp://en.wikipedia.org/wiki/Comparison_sorthttp://en.wikipedia.org/wiki/Comparison_sorthttp://en.wikipedia.org/wiki/Disjoint-set_data_structurehttp://en.wikipedia.org/wiki/Disjoint-set_data_structurehttp://en.wikipedia.org/wiki/Disjoint-set_data_structurehttp://en.wikipedia.org/wiki/Counting_sorthttp://en.wikipedia.org/wiki/Counting_sorthttp://en.wikipedia.org/wiki/Counting_sorthttp://en.wikipedia.org/wiki/Counting_sorthttp://en.wikipedia.org/wiki/Radix_sorthttp://en.wikipedia.org/wiki/Radix_sorthttp://en.wikipedia.org/wiki/Radix_sorthttp://en.wikipedia.org/wiki/Disjoint-set_data_structurehttp://en.wikipedia.org/wiki/Disjoint-set_data_structurehttp://en.wikipedia.org/wiki/Disjoint-set_data_structurehttp://en.wikipedia.org/wiki/Ackermann_functionhttp://en.wikipedia.org/wiki/Ackermann_functionhttp://en.wikipedia.org/wiki/Ackermann_functionhttp://en.wikipedia.org/w/index.php?title=Kruskal%27s_algorithm&action=edit§ion=3http://en.wikipedia.org/w/index.php?title=Kruskal%27s_algorithm&action=edit§ion=3http://en.wikipedia.org/w/index.php?title=Kruskal%27s_algorithm&action=edit§ion=3http://en.wikipedia.org/w/index.php?title=Kruskal%27s_algorithm&action=edit§ion=4http://en.wikipedia.org/w/index.php?title=Kruskal%27s_algorithm&action=edit§ion=4http://en.wikipedia.org/w/index.php?title=Kruskal%27s_algorithm&action=edit§ion=4http://en.wikipedia.org/w/index.php?title=Kruskal%27s_algorithm&action=edit§ion=4http://en.wikipedia.org/w/index.php?title=Kruskal%27s_algorithm&action=edit§ion=3http://en.wikipedia.org/wiki/Ackermann_functionhttp://en.wikipedia.org/wiki/Disjoint-set_data_structurehttp://en.wikipedia.org/wiki/Radix_sorthttp://en.wikipedia.org/wiki/Counting_sorthttp://en.wikipedia.org/wiki/Counting_sorthttp://en.wikipedia.org/wiki/Disjoint-set_data_structurehttp://en.wikipedia.org/wiki/Comparison_sort -
8/4/2019 Kruskal 2
4/16
This is our original graph. The numbers near the arcs indicate their weight. None of the arcs are highlighted.
AD and CE are the shortest arcs, with length 5, and AD has been arbitrarily chosen, so it is highlighted.
CE is now the shortest arc that does not form a cycle, with length 5, so it is highlighted as the second arc.
The next arc, DF with length 6, is highlighted using much the same method.
The next-shortest arcs are AB and BE, both with length 7. AB is chosen arbitrarily, and is highlighted. The arc BD has been highlighted in red, because there already exists a path (in green) between B and D, so it would form a cycle (ABD) if it were chosen.
The process continues to highlight the next-smallest arc, BE with length 7. Many more arcs are highlighted in red at this stage: BC because it would form the loop BCE, DE because it would form the loop DEBA, and FE because it would form FEBAD.
Finally, the process finishes with the arc EG of length 9, and the minimum spanning tree is found.
Proof of correctness
The proof consists of two parts. First, it is proved that the algorithm produces a spanning tree. Second, it is
proved that the constructed spanning tree is of minimal weight.
Spanning tree
Let P be a connected, weighted graph and let Y be the subgraph of P produced by the algorithm. Y cannot have a cycle, since the last edge added to that cycle would have been within one subtree and not between two different trees. Y cannot be disconnected, since the first encountered edge that joins two components of Y would have been added by the algorithm. Thus, Y is a spanning tree of P.
Minimality
We show that the following proposition P is true by induction: If F is the set of edges chosen at any stage of the algorithm, then there is some minimum spanning tree that contains F.
Clearly P is true at the beginning, when F is empty: any minimum spanning tree will do, and there exists one because a weighted connected graph always has a minimum spanning tree.
Now assume P is true for some non-final edge set F and let T be a minimum spanning tree that contains F. If the next chosen edge e is also in T, then P is true for F + e. Otherwise, T + e has a cycle C, and there is another edge f that is in C but not in F. (If there were no such edge f, then e could not have been added to F, since doing so would have created the cycle C.) Then T − f + e is a tree, and it has the same weight as T, since T has minimum weight and the weight of f cannot be less than the weight of e; otherwise the algorithm would have chosen f instead of e. So T − f + e is a minimum spanning tree containing F + e, and again P holds.
Therefore, by the principle of induction, P holds when F has become a spanning tree, which is only possible if F is a minimum spanning tree itself.
Kruskal's Algorithm
This minimum spanning tree algorithm was first described by Kruskal in 1956, in the same paper where he rediscovered Jarník's algorithm. The algorithm was also rediscovered in 1957 by Loberman and Weinberger, but somehow avoided being renamed after them. The basic idea of Kruskal's algorithm is as follows: scan all edges in increasing weight order; if an edge is safe, keep it (i.e., add it to the set A).
Overall Strategy
Kruskal's Algorithm, as described in CLRS, is directly based on the generic MST algorithm. It builds the MST as a forest. Initially, each vertex is in its own tree in the forest. Then the algorithm considers each edge in turn, ordered by increasing weight. If an edge (u, v) connects two different trees, then (u, v) is added to the set of edges of the MST, and the two trees connected by the edge (u, v) are merged into a single tree. On the other hand, if an edge (u, v) connects two vertices in the same tree, then edge (u, v) is discarded.
A little more formally, we are given a connected, undirected, weighted graph with a weight function w: E → R. The algorithm:
Starts with each vertex being its own component.
Repeatedly merges two components into one by choosing the light edge that connects them (i.e., the light edge crossing the cut between them).
Scans the set of edges in monotonically increasing order
by weight.
Uses a disjoint-set data structure to determine whether an
edge connects vertices in different components.
Data Structure
Before formalizing the above idea, let's quickly review the disjoint-set data structure from Chapter 21.
MAKE-SET(v): Creates a new set whose only member is (and is pointed to by) v. Note that for this operation, v must not already be in another set.
FIND-SET(v): Returns a pointer to the representative of the set containing v.
UNION(u, v): Unites the dynamic sets that contain u and v into a new set that is the union of these two sets.
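These three operations can be sketched in Python as follows (a minimal version using union by rank with path compression, as in Chapter 21; the class name is mine, not from the text):

```python
class DisjointSet:
    """Disjoint-set forest with union by rank and path compression."""

    def __init__(self):
        self.parent = {}
        self.rank = {}

    def make_set(self, v):
        # v must not already belong to another set.
        self.parent[v] = v
        self.rank[v] = 0

    def find_set(self, v):
        # Return the representative of the set containing v,
        # compressing the path along the way.
        if self.parent[v] != v:
            self.parent[v] = self.find_set(self.parent[v])
        return self.parent[v]

    def union(self, u, v):
        # Unite the sets containing u and v.
        ru, rv = self.find_set(u), self.find_set(v)
        if ru == rv:
            return
        if self.rank[ru] < self.rank[rv]:
            ru, rv = rv, ru           # attach the shorter tree to the taller
        self.parent[rv] = ru
        if self.rank[ru] == self.rank[rv]:
            self.rank[ru] += 1
```

With these two heuristics, a sequence of m operations on n elements runs in O(m α(n)) time, which is the bound used in the analysis below.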
Algorithm
Start with an empty set A, and select at every stage the
shortest edge that has not been chosen or rejected, regardless
of where this edge is situated in the graph.
KRUSKAL(V, E, w)
    A ← { }                // set A will ultimately contain the edges of the MST
    for each vertex v in V
        do MAKE-SET(v)
    sort E into nondecreasing order by weight w
    for each (u, v) taken from the sorted list
        do if FIND-SET(u) ≠ FIND-SET(v)
            then A ← A ∪ {(u, v)}
                UNION(u, v)
    return A
Illustrative Examples
Let's run through the following graph quickly to see how Kruskal's algorithm works on it:
We get the shaded edges shown in the above figure.
Edge (c, f) : safe
Edge (g, i) : safe
Edge (e, f) : safe
Edge (c, e) : reject
Edge (d, h) : safe
Edge (f, h) : safe
Edge (e, d) : reject
Edge (b, d) : safe
Edge (d, g) : safe
Edge (b, c) : reject
Edge (g, h) : reject
Edge (a, b) : safe
At this point, we have only one component, so all other edges will be rejected. [We could add a test to the main loop of KRUSKAL to stop once |V| − 1 edges have been added to A.]
Note carefully: Suppose we had examined (c, e) before (e, f). Then we would have found (c, e) safe and would have rejected (e, f).
Example (CLRS): Step-by-Step Operation of Kruskal's Algorithm.
Step 1. In the graph, the edge (g, h) is shortest. Either vertex g or vertex h could be the representative. Let's choose vertex g arbitrarily.
Step 2. The edge (c, i) creates the second tree. Choose vertex c as the representative for the second tree.
Step 3. Edge (f, g) is the next shortest edge. Add this edge and choose vertex g as representative.
Step 4. Edge (a, b) creates a third tree.
Step 5. Add edge (c, f) and merge the two trees. Vertex c is chosen as the representative.
Step 6. Edge (g, i) is the next cheapest, but if we added this edge a cycle would be created; vertex c is the representative of both endpoints.
Step 7. Instead, add edge (c, d).
Step 8. If we added edge (h, i), it would make a cycle.
Step 9. Instead of adding edge (h, i), add edge (a, h).
Step 10. Again, if we added edge (b, c), it would create a cycle. Add edge (d, e) instead to complete the spanning tree. In this spanning tree all trees are joined, and vertex c is the sole representative.
Analysis
Initialize the set A: O(1)
First for loop: |V| MAKE-SETs
Sort E: O(E lg E)
Second for loop: O(E) FIND-SETs and UNIONs
Assuming the implementation of the disjoint-set data structure already seen in Chapter 21, which uses union by rank and path compression: O((V + E) α(V)) + O(E lg E)
Since G is connected, |E| ≥ |V| − 1, so this is O(E α(V)) + O(E lg E).
α(|V|) = O(lg V) = O(lg E).
Therefore, the total time is O(E lg E).
|E| ≤ |V|², so lg |E| = O(2 lg V) = O(lg V).
Therefore, O(E lg V) time. (If the edges are already sorted, O(E α(V)), which is almost linear.)
II. Kruskal's Algorithm Implemented with a Priority Queue Data Structure
MST_KRUSKAL(G)
    for each vertex v in V[G]
        do define set S(v) ← {v}
    Initialize priority queue Q to contain all edges of G, using the weights as keys
    A ← { }                // A will ultimately contain the edges of the MST
    while A has fewer than n − 1 edges
        do (u, v) ← Q.removeMin()
            Let set S(v) contain v and S(u) contain u
            if S(v) ≠ S(u)
                then Add edge (u, v) to A
                    Merge S(v) and S(u) into one set, i.e., union S(v) and S(u)
    return A
Analysis
Edge weights can be compared in constant time. Initialization of the priority queue takes O(E lg E) time by repeated insertion. At each iteration of the while loop, the minimum edge can be removed in O(log E) time, which is O(log V), since the graph is simple. The total running time is O((V + E) log V), which is O(E lg V) since the graph is simple and connected.
Kruskal Algorithm
The input is a connected weighted graph G with n vertices.
Step 1. Arrange the edges of G in order of increasing weights.
Step 2. Starting only with the vertices of G and proceeding sequentially, add each edge which does not result in a cycle until n − 1 edges are added.
Step 3. Exit.
The weight of a minimal spanning tree is unique, but the minimal spanning tree itself is not. Different
minimal spanning trees can occur when two or more edges have the same weight. In such a case, the
arrangement of the edges in Step 1 of Algorithms 1.8A or 1.8B is not unique and hence may result in
different minimal spanning trees as illustrated in the following example.
Example 1.1
Find a minimal spanning tree of the weighted graph Q in Figure (a). Note that Q has six vertices, so a minimal spanning tree will have five edges.
(a) Here we apply Algorithm 1.8A.
First we order the edges by decreasing weights, and then we successively delete edges, without disconnecting Q, until five edges remain. This yields the following data:
Edges    BC  AF  AC  BE  CE  BF  AE  DF  BD
Weight    8   7   7   7   6   5   4   4   3
Delete?  Yes Yes Yes No  No  Yes No  No  No
Thus the minimal spanning tree of Q which is obtained contains the edges
BE, CE, AE, DF, BD
The spanning tree has weight 24, and it is shown in Figure (b).
(b) Here we apply the Kruskal Algorithm.
First we order the edges by increasing weights, and then we successively add edges without forming any cycles until five edges are included. This yields the following data:
Edges    BD  AE  DF  BF  CE  AC  AF  BE  BC
Weight    3   4   4   5   6   7   7   7   8
Add?     Yes Yes Yes No  Yes No  Yes No  No
Thus the minimal spanning tree of Q which is obtained contains the edges
BD, AE, DF, CE, AF
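The outcome of part (b) can be checked mechanically. The sketch below transcribes the edge table above and replays Kruskal's algorithm with a small union-find (the helper names are mine, not from the text):

```python
def kruskal_on_example():
    # Edge table from Example 1.1: (weight, edge name).
    edges = [(8, "BC"), (7, "AF"), (7, "AC"), (7, "BE"), (6, "CE"),
             (5, "BF"), (4, "AE"), (4, "DF"), (3, "BD")]
    parent = {v: v for v in "ABCDEF"}

    def find(v):
        # Representative of v's component, with path halving.
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    chosen = []
    for w, e in sorted(edges):        # increasing weight, as in Step 1
        u, v = e
        if find(u) != find(v):        # safe: the edge joins two trees
            parent[find(u)] = find(v)
            chosen.append(e)
    return chosen
```

Running it yields the five edges BD, AE, DF, CE, AF with total weight 3 + 4 + 4 + 6 + 7 = 24, matching the tree above (with a different ordering of the equal-weight edges, a different minimal spanning tree of the same weight could result).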
The spanning tree appears in Figure (c). Observe that this spanning tree is not the same as the one obtained using Algorithm 1.8A.
Remark: The above algorithms are easily executed when the graph G is relatively small, as in Fig. 1-19(a). Suppose G has dozens of vertices and hundreds of edges which, say, are given by a list of vertices. Then even deciding whether G is connected is not obvious; it may require some type of depth-first search (DFS) or breadth-first search (BFS) graph algorithm. Later sections will discuss ways of representing graphs G in memory and will discuss various graph algorithms.