Design and Analysis of Algorithms
LECTURE 1
Analysis of Algorithms
• Insertion sort
• Asymptotic analysis
• Merge sort
• Recurrences
Analysis of algorithms
The theoretical study of computer-program
performance and resource usage.
What’s more important than performance?
• modularity
• correctness
• maintainability
• functionality
• robustness
• user-friendliness
• programmer time
• simplicity
• extensibility
• reliability
Why study algorithms and performance?
• Algorithms help us to understand scalability.
• Performance often draws the line between what
is feasible and what is impossible.
• Algorithmic mathematics provides a language
for talking about program behavior.
• Performance is the currency of computing.
The problem of sorting
Input: sequence a1, a2, …, an of numbers.
Output: permutation a'1, a'2, …, a'n such
that a'1 ≤ a'2 ≤ … ≤ a'n .
Example:
Input: 8 2 4 9 3 6
Output: 2 3 4 6 8 9
Insertion sort (“pseudocode”)
INSERTION-SORT (A, n)   ⊳ A[1 . . n]
  for j ← 2 to n
    do key ← A[ j]
       i ← j – 1
       while i > 0 and A[i] > key
         do A[i+1] ← A[i]
            i ← i – 1
       A[i+1] ← key
Insertion sort (with array diagram)
Same pseudocode as above, shown alongside a picture of the array:
[Figure: array A[1 . . n]; the prefix A[1 . . j–1] is already “sorted”; key is taken from A[j] and compared against A[i], scanning left from i = j–1, then inserted into the sorted prefix.]
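A direct Python transcription of the pseudocode above, using 0-based indexing in place of A[1 . . n]; a minimal sketch for experimentation, not the lecture's own code.

def insertion_sort(a):
    """Sort the list a in place, mirroring INSERTION-SORT(A, n)."""
    for j in range(1, len(a)):           # pseudocode: for j <- 2 to n
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:     # shift elements larger than key to the right
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key                   # drop key into its place in the sorted prefix
    return a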
Example of insertion sort
8 2 4 9 3 6
2 8 4 9 3 6
2 4 8 9 3 6
2 4 8 9 3 6
2 3 4 8 9 6
2 3 4 6 8 9   done
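The passes above can be reproduced by printing the array after each insertion; a small variant of the sketch above, for illustration only.

def insertion_sort_trace(a):
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
        print(*a)                        # state of the array after inserting a[j]

insertion_sort_trace([8, 2, 4, 9, 3, 6])
# 2 8 4 9 3 6
# 2 4 8 9 3 6
# 2 4 8 9 3 6
# 2 3 4 8 9 6
# 2 3 4 6 8 9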
Running time
• The running time depends on the input: an
already sorted sequence is easier to sort.
• Parameterize the running time by the size of
the input, since short sequences are easier to
sort than long ones.
• Generally, we seek upper bounds on the
running time, because everybody likes a
guarantee.
Kinds of analyses
Worst-case: (usually)
• T(n) = maximum time of algorithm
on any input of size n.
Average-case: (sometimes)
• T(n) = expected time of algorithm
over all inputs of size n.
• Need assumption of statistical
distribution of inputs.
Best-case: (bogus)
• Cheat with a slow algorithm that
works fast on some input.
Machine-independent time
What is insertion sort’s worst-case time?
• It depends on the speed of our computer:
• relative speed (on the same machine),
• absolute speed (on different machines).
BIG IDEA:
• Ignore machine-dependent constants.
• Look at growth of T(n) as n → ∞ .
“Asymptotic Analysis”
Θ-notation
Math:
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0
            such that 0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n)
            for all n ≥ n0 }
Engineering:
• Drop low-order terms; ignore leading constants.
• Example: 3n³ + 90n² – 5n + 6046 = Θ(n³)
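As a concrete check of the definition (the constants below are chosen here for illustration; the slide itself does not fix them): for f(n) = 3n³ + 90n² – 5n + 6046 and g(n) = n³, the constants c1 = 3, c2 = 100, n0 = 5 work, since 0 ≤ 3n³ ≤ f(n) for every n ≥ 1 (because 90n² – 5n + 6046 > 0), and f(n) ≤ 100n³ for every n ≥ 5 (because 90n² + 6046 ≤ 97n³ + 5n from n = 5 on).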
Asymptotic performance
When n gets large enough, a Θ(n²) algorithm always beats a Θ(n³) algorithm.
• Real-world design situations often call for a careful balancing of engineering objectives.
• Asymptotic analysis is a useful tool to help to structure our thinking.
• We shouldn’t ignore asymptotically slower algorithms, however.
Insertion sort analysis
Worst case: Input reverse sorted.
  T(n) = Σ_{j=2}^{n} Θ(j) = Θ(n²)    [arithmetic series]
Average case: All permutations equally likely.
  T(n) = Σ_{j=2}^{n} Θ(j/2) = Θ(n²)
Is insertion sort a fast sorting algorithm?
• Moderately so, for small n.
• Not at all, for large n.
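A quick empirical check of the Θ(n²) behavior: count the element shifts insertion sort performs on reverse-sorted versus randomly permuted input (an illustrative sketch counting operations, not wall-clock time).

import random

def count_shifts(a):
    """Number of element shifts insertion sort performs on a copy of a."""
    a = list(a)
    shifts = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
            shifts += 1
        a[i + 1] = key
    return shifts

n = 1000
print(count_shifts(range(n, 0, -1)))             # reverse sorted: n(n-1)/2 = 499500 shifts
print(count_shifts(random.sample(range(n), n)))  # random permutation: about n(n-1)/4 on average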
Merge sort
MERGE-SORT A[1 . . n]
1. If n = 1, done.
2. Recursively sort A[ 1 . . ⌈n/2⌉ ]
   and A[ ⌈n/2⌉+1 . . n ].
3. “Merge” the 2 sorted lists.
Key subroutine: MERGE
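A compact Python rendering of MERGE-SORT with a two-pointer MERGE helper; a sketch following the slide's outline, with 0-based slicing in place of A[1 . . ⌈n/2⌉].

def merge(left, right):
    """Merge two sorted lists into one sorted list in linear time."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])    # at most one of these two is non-empty
    out.extend(right[j:])
    return out

def merge_sort(a):
    """MERGE-SORT A[1 . . n]: if n = 1 done; sort both halves; merge."""
    if len(a) <= 1:                       # step 1
        return list(a)
    mid = (len(a) + 1) // 2               # ⌈n/2⌉
    return merge(merge_sort(a[:mid]),     # step 2: sort the two halves
                 merge_sort(a[mid:]))     # step 3: merge the sorted lists

print(merge_sort([8, 2, 4, 9, 3, 6]))     # [2, 3, 4, 6, 8, 9]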
Merging two sorted arrays
Example: merge the sorted lists 2 7 13 20 and 1 9 11 12.
Repeatedly compare the front (smallest remaining) elements of the two lists and output the smaller one:
output: 1  2  7  9  11  12  …   (13 and 20 then remain in the first list)
Time = Θ(n) to merge a total of n elements (linear time).
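Using the merge helper from the sketch above on the two lists in this example (illustrative):

print(merge([2, 7, 13, 20], [1, 9, 11, 12]))   # [1, 2, 7, 9, 11, 12, 13, 20]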
Analyzing merge sort
MERGE-SORT A[1 . . n]                       T(n)
1. If n = 1, done.                          Θ(1)
2. Recursively sort A[ 1 . . ⌈n/2⌉ ]
   and A[ ⌈n/2⌉+1 . . n ].                  2T(n/2)
3. “Merge” the 2 sorted lists.              Θ(n)
Abuse / Sloppiness: Should be T(⌈n/2⌉) + T(⌊n/2⌋), but it
turns out not to matter asymptotically.
Recurrence for merge sort
T(n) =  Θ(1)             if n = 1;
        2T(n/2) + Θ(n)   if n > 1.
• We shall usually omit stating the base case when T(n) = Θ(1) for sufficiently small n, but only when it has no effect on the asymptotic solution to the recurrence.
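A numerical sanity check of the recurrence, with Θ(1) and Θ(n) replaced by the concrete costs 1 and n and n restricted to powers of 2 (assumptions made purely for illustration):

from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    """Merge-sort recurrence with unit constants: T(1) = 1, T(n) = 2 T(n/2) + n."""
    return 1 if n == 1 else 2 * T(n // 2) + n

for k in (10, 15, 20):
    n = 2 ** k
    print(n, T(n), T(n) / (n * log2(n)))   # ratio is 1 + 1/lg n, tending to 1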
Recursion tree
Solve T(n) = 2T(n/2) + cn, where c > 0 is constant.
Expand T(n) into a tree: the root does cn work and has two children T(n/2); each child does cn/2 work and has two children T(n/4); and so on, down to the Θ(1) leaves.
• Height h = lg n.
• Each level sums to cn (2^k nodes of cost cn/2^k at level k).
• #leaves = n, contributing Θ(n).
• Total = cn · lg n + Θ(n) = Θ(n lg n).
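The level-by-level accounting can be checked directly for a power of two; a tiny illustrative sketch with c = 1:

from math import log2

n, c = 1024, 1
internal_levels = int(log2(n))               # h = lg n = 10 levels of internal nodes
per_level_cost = [c * n] * internal_levels   # each level sums to cn
leaf_cost = n                                # n leaves, unit work each (the Θ(n) term)
print(sum(per_level_cost) + leaf_cost)       # 11264 = n·lg n + n, matching T(1024) above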
Conclusions
• Θ(n lg n) grows more slowly than Θ(n²).
• Therefore, merge sort asymptotically beats insertion sort in the worst case.
• In practice, merge sort beats insertion sort for n > 30 or so.
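A rough way to observe the crossover yourself, timing the insertion_sort and merge_sort sketches defined earlier; absolute numbers and the exact crossover depend on the machine and the implementation constants, so treat "n > 30 or so" as indicative only.

import random, timeit

for n in (10, 30, 100, 1000):
    data = random.sample(range(n), n)
    t_ins = timeit.timeit(lambda: insertion_sort(list(data)), number=200)
    t_mrg = timeit.timeit(lambda: merge_sort(list(data)), number=200)
    print(f"n = {n:4d}   insertion sort {t_ins:.4f} s   merge sort {t_mrg:.4f} s")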