2. Master Theorem
• T(n) = a * T(n/b) + f(n)
• Where:
• a = number of subproblems
• b = factor by which the problem size is divided
• f(n) = cost of work outside the recursion
• We compare f(n) with n^{log_b a} to find the dominant term.
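The watershed exponent log_b a can be computed directly. A minimal helper sketch (the function name is hypothetical, not part of the theorem):

```python
import math

def watershed_exponent(a: int, b: int) -> float:
    """Return log_b(a), the exponent of the n^(log_b a) term."""
    return math.log(a, b)

# For T(n) = 8T(n/2) + n: a = 8, b = 2, so log_b a = 3
print(watershed_exponent(8, 2))
```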
3. Case 1: Recursion Dominates
• If f(n) = O(n^{log_b a - ε}) for some ε > 0
• Then: T(n) = Θ(n^{log_b a})
• Example:
• T(n) = 8T(n/2) + n
• ⇒ a = 8, b = 2, log_b a = 3
• ⇒ f(n) = n = O(n^{3 - ε}) (e.g. with ε = 1)
• ⇒ T(n) = Θ(n^3)
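This example can be checked numerically. A sketch assuming T(1) = 1 and n a power of 2: doubling n should multiply T(n) by roughly 2^3 = 8, as Θ(n^3) predicts.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: int) -> int:
    """T(n) = 8*T(n/2) + n, with assumed base case T(1) = 1."""
    if n == 1:
        return 1
    return 8 * T(n // 2) + n

# Cubic growth: the ratio T(2n) / T(n) approaches 8.
print(T(1024) / T(512))
```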
4. Case 2: Equal Work
• If f(n) = Θ(n^{log_b a})
• Then: T(n) = Θ(n^{log_b a} * log n)
• Example:
• T(n) = 2T(n/2) + n
• ⇒ a = 2, b = 2, log_b a = 1
• ⇒ f(n) = Θ(n)
• ⇒ T(n) = Θ(n log n)
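A numeric sketch of this example (assuming T(1) = 1 and n a power of 2): the recurrence matches the closed form n·log2(n) + n exactly, which is Θ(n log n).

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: int) -> int:
    """T(n) = 2*T(n/2) + n, with assumed base case T(1) = 1
    (the merge-sort shape)."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

n = 1024
print(T(n), n * int(math.log2(n)) + n)  # both equal 11264
```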
5. Case 3: Outside Work Dominates
• If f(n) = Ω(n^{log_b a + ε}) for some ε > 0
• AND regularity condition holds:
• a * f(n/b) ≤ c * f(n) for some c < 1
• Then: T(n) = Θ(f(n))
• Example:
• T(n) = 2T(n/2) + n^2
• ⇒ a = 2, b = 2, log_b a = 1
• ⇒ f(n) = n^2 = Ω(n^{1 + ε}) (e.g. with ε = 1)
• ⇒ Regularity holds: 2 · (n/2)^2 = n^2/2 ≤ c · n^2 with c = 1/2 < 1
• ⇒ T(n) = Θ(n^2)
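Numerically (assuming T(1) = 1 and n a power of 2), T(n)/n^2 settles near a constant, consistent with T(n) = Θ(n^2):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: int) -> int:
    """T(n) = 2*T(n/2) + n^2, with assumed base case T(1) = 1."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n * n

# Closed form for this base case is T(n) = 2n^2 - n,
# so T(n) / n^2 approaches the constant 2.
print(T(1024) / 1024**2)
```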
6. What is ε (Epsilon)?
• ε is a small positive constant.
• Used to compare f(n) to n^{log_b a}:
• - Case 1: f(n) is a little smaller
• - Case 2: f(n) is the same
• - Case 3: f(n) is a little bigger
• It helps identify which part of the recurrence dominates the time complexity.
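A concrete witness for the Case 1 example above (a = 8, b = 2, log_b a = 3): with ε = 1, the ratio f(n) / n^{3 - ε} shrinks to 0, so f(n) = O(n^{3 - ε}) holds.

```python
# f(n) = n versus n^(3 - epsilon) with epsilon = 1:
# n / n^2 = 1/n, which vanishes as n grows,
# so the recursion term n^3 dominates (Case 1).
for n in (10, 100, 1000):
    print(n, n / n**2)
```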
7. What Does 'Equal Work' Mean?
• 'Equal work' means:
• Time spent in recursion ≈ Time spent outside recursion
• Formally:
• f(n) = Θ(n^{log_b a})
• Example:
• T(n) = 2T(n/2) + n
• ⇒ Both parts grow at the same rate, Θ(n), so T(n) = Θ(n log n)
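The 'equal work' picture can also be seen level by level in the recursion tree (assuming n a power of 2): level i has 2^i subproblems of size n/2^i, each doing linear outside work, so every level costs n in total, and there are log2(n) + 1 levels.

```python
import math

n = 1024
levels = int(math.log2(n)) + 1
# Total outside work at level i: 2^i subproblems * (n / 2^i) each = n.
per_level = [2**i * (n // 2**i) for i in range(levels)]
print(per_level[:3], sum(per_level))  # each level costs n; total is n * (log2(n) + 1)
```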
8. Summary of All Cases
• Case 1: Recursion Dominates
• - f(n) = O(n^{log_b a - ε})
• - T(n) = Θ(n^{log_b a})
• Case 2: Equal Work
• - f(n) = Θ(n^{log_b a})
• - T(n) = Θ(n^{log_b a} log n)
• Case 3: Outside Work Dominates
• - f(n) = Ω(n^{log_b a + ε}), plus the regularity condition
• - T(n) = Θ(f(n))
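For polynomial driving functions f(n) = n^k, the three cases reduce to comparing k with log_b a. A small classifier sketch (the function name and output strings are hypothetical; it assumes f(n) is a plain polynomial, so the Case 3 regularity condition holds automatically):

```python
import math

def master_case(a: int, b: int, k: float) -> str:
    """Classify T(n) = a*T(n/b) + Theta(n^k) by the Master Theorem."""
    crit = math.log(a, b)  # the watershed exponent log_b a
    if abs(k - crit) < 1e-12:
        return f"Case 2: Theta(n^{crit:g} log n)"
    if k < crit:
        return f"Case 1: Theta(n^{crit:g})"
    return f"Case 3: Theta(n^{k:g})"

print(master_case(8, 2, 1))  # T(n) = 8T(n/2) + n
print(master_case(2, 2, 1))  # T(n) = 2T(n/2) + n
print(master_case(2, 2, 2))  # T(n) = 2T(n/2) + n^2
```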