Decision-tree example

Sort 〈a1, a2, a3〉 = 〈9, 4, 6〉:

[Figure: decision tree for n = 3. Internal nodes 1:2, 2:3, 1:3; leaves 123, 132, 312, 213, 231, 321. The comparison 4 ≤ 6 at node 2:3 is highlighted on the path for this input.]

Each internal node is labeled i:j for i, j ∈ {1, 2, …, n}.
• The left subtree shows subsequent comparisons if ai ≤ aj.
• The right subtree shows subsequent comparisons if ai ≥ aj.
Decision-tree example

Sort 〈a1, a2, a3〉 = 〈9, 4, 6〉:

[Figure: the same decision tree, now with the path ending at leaf 231, establishing 4 ≤ 6 ≤ 9.]

Each leaf contains a permutation 〈π(1), π(2), …, π(n)〉 to indicate that the ordering aπ(1) ≤ aπ(2) ≤ ⋯ ≤ aπ(n) has been established.
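The tree above can be written out directly as nested comparisons. The Python sketch below (my own, not part of the original slides) hard-codes the n = 3 decision tree: each branch is one internal node i:j, and each return is a leaf giving the sorted order. On the slide's input 〈9, 4, 6〉 it follows the path 1:2 → 1:3 → 2:3 and lands in leaf 231.

```python
def sort3(a1, a2, a3):
    """Walk the n = 3 decision tree: each branch is an internal node i:j,
    each return is a leaf <pi(1), pi(2), pi(3)> giving the sorted order."""
    if a1 <= a2:                      # node 1:2
        if a2 <= a3:                  # node 2:3
            return (a1, a2, a3)       # leaf 123
        elif a1 <= a3:                # node 1:3
            return (a1, a3, a2)       # leaf 132
        else:
            return (a3, a1, a2)       # leaf 312
    else:
        if a1 <= a3:                  # node 1:3
            return (a2, a1, a3)       # leaf 213
        elif a2 <= a3:                # node 2:3
            return (a2, a3, a1)       # leaf 231
        else:
            return (a3, a2, a1)       # leaf 321

print(sort3(9, 4, 6))                 # (4, 6, 9), reached via leaf 231
```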
Decision-tree model

A decision tree can model the execution of any comparison sort:
• One tree for each input size n.
• View the algorithm as splitting whenever it compares two elements.
• The tree contains the comparisons along all possible instruction traces.
• The running time of the algorithm = the length of the path taken.
• Worst-case running time = height of tree.
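As a concrete illustration of the model (a sketch with my own naming, not from the lecture), the instrumented insertion sort below records every comparison it makes. The recorded sequence of i:j decisions is exactly the root-to-leaf path the input follows in the algorithm's decision tree, and its length is the running time charged by the model.

```python
def traced_insertion_sort(a):
    """Insertion sort instrumented to record its comparison trace.
    Comparisons are reported as 1-based pairs of original indices, so the
    trace is the root-to-leaf path this input takes in the decision tree."""
    order = list(range(len(a)))          # permutation of original indices
    trace = []

    def le(i, j):                        # one internal node i:j
        outcome = a[i] <= a[j]
        trace.append((i + 1, j + 1, '<=' if outcome else '>'))
        return outcome

    for k in range(1, len(order)):
        m = k
        while m > 0 and not le(order[m - 1], order[m]):
            order[m - 1], order[m] = order[m], order[m - 1]
            m -= 1
    return [a[i] for i in order], trace

# The slide's input follows the path 1:2 (>), 1:3 (>), 2:3 (<=), i.e. leaf 231.
print(traced_insertion_sort([9, 4, 6]))
```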
Lower bound for decision-tree sorting

Theorem. Any decision tree that can sort n elements must have height Ω(n lg n).

Proof. The tree must contain ≥ n! leaves, since there are n! possible permutations. A height-h binary tree has ≤ 2^h leaves. Thus, n! ≤ 2^h.

∴ h ≥ lg(n!)        (lg is mono. increasing)
    ≥ lg((n/e)^n)   (Stirling's formula)
    = n lg n − n lg e
    = Ω(n lg n).
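A quick numerical check of the Stirling step (mine, not from the slides): the exact bound lg(n!), computed via the log-gamma function to avoid overflow, always dominates the estimate n lg n − n lg e used in the proof.

```python
import math

# Compare the exact bound lg(n!) with Stirling's lower estimate n lg n - n lg e.
for n in (4, 16, 64, 256, 1024):
    exact = math.lgamma(n + 1) / math.log(2)             # lg(n!)
    stirling = n * math.log2(n) - n * math.log2(math.e)  # n lg n - n lg e
    print(f"n = {n:5d}   lg(n!) = {exact:10.1f}   n lg n - n lg e = {stirling:10.1f}")
```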
Lower bound for comparison sorting

Corollary. Heapsort and merge sort are asymptotically optimal comparison sorting algorithms: each runs in O(n lg n) time in the worst case, matching the Ω(n lg n) lower bound.
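To see the corollary concretely, the rough experiment below (my own sketch, not from the lecture) counts the worst-case comparisons made by a plain top-down merge sort over all inputs of size n = 8 and compares that with the information-theoretic bound ⌈lg(n!)⌉. Merge sort needs 17 comparisons against a bound of 16, so it is within a constant factor of optimal, as the corollary asserts.

```python
import itertools
import math

def merge_sort_comparisons(a):
    """Count element comparisons made by a plain top-down merge sort."""
    count = 0

    def sort(xs):
        nonlocal count
        if len(xs) <= 1:
            return xs
        mid = len(xs) // 2
        left, right = sort(xs[:mid]), sort(xs[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            count += 1                       # one comparison per merge step
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged += left[i:] + right[j:]
        return merged

    sort(list(a))
    return count

n = 8
worst = max(merge_sort_comparisons(p) for p in itertools.permutations(range(n)))
print(worst, math.ceil(math.log2(math.factorial(n))))    # 17 vs 16 for n = 8
```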