In the analysis of Kruskal's algorithm, sorting the edges takes O(m log m) time when we use a comparison sort such as quicksort. But as described in 'Algorithm Design', the maximum number of edges is smaller than the square of the number of nodes, so sorting the edges costs at most O(m log n) time. I can't understand this statement: even in a complete graph, shouldn't the maximum number of edges be n(n-1)/2?


Answer (Berthur):

You're right that a complete graph has n(n-1)/2 edges. Since n(n-1)/2 = n^2/2 - n/2 < n^2, the statement you quote holds true.

If your source of confusion is "why did the author say less than n^2 instead of n(n-1)/2", then it's likely because n^2 is a simpler expression, and it is sufficient for what that paragraph is trying to prove, namely that the cost of sorting is at most m log n. Concretely, m < n^2 implies log m < log(n^2) = 2 log n, so O(m log m) is the same as O(m log n).
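A quick numerical sanity check of that inequality (using an arbitrary n purely for illustration):

```python
import math

n = 1000
m = n * (n - 1) // 2            # number of edges in a complete graph on n nodes
# m is below n^2, so log m is within a constant factor (2) of log n,
# which is why O(m log m) and O(m log n) describe the same growth rate.
print(m < n * n)                 # True
print(math.log2(m) < 2 * math.log2(n))  # True
```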

If you prove that something holds whenever the number of edges is <= n^2, then it will also hold for any real graph, since the number of edges is at most n(n-1)/2 <= n^2.
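For context, here is a minimal sketch of Kruskal's algorithm showing where the sorting step (the subject of the bound above) fits in. It assumes edges are given as (cost, u, v) tuples and nodes are labeled 0..n-1; the union-find structure here is a simple version with path compression only:

```python
def kruskal(n, edges):
    # Sorting m edges costs O(m log m); since m < n^2,
    # log m < 2 log n, so this step is also O(m log n).
    edges = sorted(edges)

    # Union-find parent array: each node starts as its own component.
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    mst = []
    for cost, u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:               # edge joins two components: keep it
            parent[ru] = rv
            mst.append((cost, u, v))
    return mst
```

For example, `kruskal(4, [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)])` selects the three edges of cost 1, 2, and 4, skipping the cost-3 edge because it would close a cycle.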