Class Notes

University of Waterloo
Computer Science
CS 341
Forbes Burkowski

# Lecture #4 - Recurrences and the Master Theorem Fall 2009


Solving Recurrences
• Method 1: Recursion Trees
• Method 2: Master theorem
– Prove a general theorem covering a wide variety
of recurrences.
– Apply the theorem in “cookbook” fashion.
• Method 3: Guessing and Checking method
– Guess an answer.
– Prove it using induction.
Recursion Trees: Merge Sort (page 1)
– Merge sort: an analysis:
// Sort sequence A[l..r]
function merge_sort(l, r)
// Base case: 1 element is always sorted
if (l = r) then return;
m := floor((l + r)/2);
// We need to sort both subsequences l..m, m+1..r
merge_sort(l, m);
merge_sort(m + 1, r);
// Finally: merge the two sorted sequences
merge(l, m, r);
Recursion Trees: Merge Sort (page 2)
– In the previous code we used a merge function:
// Merge two sorted sequences l..m, m+1..r
function merge(l, m, r)
copy A[l..m] to L; L[m - l + 2] := infinity;
copy A[m + 1..r] to R; R[r - m + 1] := infinity;
i := 1; j := 1; k := l;
while (L[i] < infinity or R[j] < infinity) do
if L[i] <= R[j] then
A[k] := L[i];
i := i + 1; k := k + 1;
else
A[k] := R[j];
j := j + 1; k := k + 1;
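The two routines above can be sketched as runnable Python, a direct translation of the pseudocode (0-indexed, with the same infinity sentinels; the while loop becomes a for loop that performs exactly r - l + 1 placements, which the sentinels guarantee):

```python
def merge(A, l, m, r):
    # Copy the two sorted runs and append an infinity sentinel to each,
    # mirroring L[m - l + 2] := infinity and R[r - m + 1] := infinity.
    L = A[l:m + 1] + [float('inf')]
    R = A[m + 1:r + 1] + [float('inf')]
    i = j = 0
    for k in range(l, r + 1):       # exactly r - l + 1 placements
        if L[i] <= R[j]:
            A[k] = L[i]
            i += 1
        else:
            A[k] = R[j]
            j += 1

def merge_sort(A, l, r):
    # Sort A[l..r] in place.
    if l >= r:                      # base case: one element is already sorted
        return
    m = (l + r) // 2                # floor of the midpoint
    merge_sort(A, l, m)
    merge_sort(A, m + 1, r)
    merge(A, l, m, r)
```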
Recursion Trees: Merge Sort (page 3)
– Given the recurrence for T(n):
T(n) = 0 if n = 1
T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + Θ(n) if n > 1
we display the computation in the form of a tree
(one node for each time the function is invoked).
– Label each node in the tree with the combine time
cost incurred at that node.
– Leaves are labeled with the base case cost T(1).
– The sum of all costs (including the leaves) is the
total running time of the algorithm, so we sum up
all costs in a convenient way (usually level by level).
Recursion Trees: Merge Sort (page 4)
[Recursion tree diagram: the root contributes an (from T(n)); its two children contribute a⌈n/2⌉ and a⌊n/2⌋ (from T(⌈n/2⌉) and T(⌊n/2⌋)); the four grandchildren each contribute a times roughly n/4, with various floor/ceiling combinations.]
• This is getting messy, so we use a sloppy form of the recurrence:
Recursion Trees: Merge Sort (page 5)
Work done at each level (1 + log n levels in all):

an                                → an
a(n/2)   a(n/2)                   → an
a(n/4)   a(n/4)   a(n/4)   a(n/4) → an
…    …    …    …
b    b    b    b    …    b    b      (2^(log n) = n leaves)

sum of the internal levels: an·(log n)
Working With Recursion Trees
1. Figure out the general pattern of the time cost labels.
2. Sum the first few levels.
3. Figure out the pattern of the level sums.
4. Compute the sum of all internal nodes.
5. Figure out the height of the tree.
6. Compute the number of base case nodes.
7. Add up all leaves of the tree.
8. Compute the final total of all the labels.
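As a sanity check on these steps, a small Python sketch (my own, assuming the sloppy merge sort recurrence T(n) = 2T(n/2) + an with a = 1, T(1) = 0, and n a power of two) compares the level-by-level tree total against direct evaluation of the recurrence:

```python
def T(n):
    # Direct evaluation of T(n) = 2*T(n/2) + n, T(1) = 0 (taking a = 1).
    return 0 if n == 1 else 2 * T(n // 2) + n

def tree_total(n):
    # Steps 1-8 for this tree: log2(n) internal levels, each summing to n,
    # plus 2^(log2 n) = n leaves, each carrying the base-case cost 0.
    levels = n.bit_length() - 1   # height = log2(n) for n a power of two
    internal = levels * n         # step 4: sum of all internal nodes
    leaves = n * 0                # step 7: n leaves of cost 0
    return internal + leaves      # step 8: final total
```

For n = 1024 both give 10·1024 = 10240, matching the an·(log n) internal sum read off the tree.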
The Master Theorem
• The Master theorem gives the solution of
recurrences of the form: T(n)=a T(n/b) + f(n)
– There are three cases, depending on the
relationships among a, b, and f.
– The theorem is applicable even when the
recurrence involves the standard floors and
ceilings in the partition, so long as the sloppy
form looks like T(n)=a T(n/b) + f(n).
Statement of Master Theorem
• Theorem. Let a ≥ 1, b > 1 be constants. Let
T(n) be such that
T(n)=a T(n/b) + f(n)
where n/b denotes either ceiling(n/b) or
floor(n/b). Then:
– Case 1: if f(n) ∈ O(n^(log_b a − ε)) for some ε > 0, then
T(n) ∈ Θ(n^(log_b a)).
– Case 2: if f(n) ∈ Θ(n^(log_b a)), then T(n) ∈ Θ(n^(log_b a) lg n).
– Case 3: if f(n) ∈ Ω(n^(log_b a + ε)) for some ε > 0, and
a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently
large n, then T(n) ∈ Θ(f(n)).
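When the driving function is a plain power, f(n) = Θ(n^d), the three cases reduce to comparing d with log_b a. A minimal cookbook sketch (the function name and output strings are my own, not from the notes):

```python
import math

def master_polynomial(a, b, d):
    # Classify T(n) = a*T(n/b) + Theta(n^d) by comparing d with x = log_b(a).
    # For polynomial f, the case-3 regularity condition a*f(n/b) <= c*f(n)
    # holds automatically with c = a / b**d < 1.
    x = math.log(a, b)
    if d < x - 1e-9:
        return "Theta(n^%.4f)" % x      # case 1: leaves dominate
    if abs(d - x) <= 1e-9:
        return "Theta(n^%g log n)" % d  # case 2: every level contributes equally
    return "Theta(n^%g)" % d            # case 3: root dominates
```

For instance, `master_polynomial(3, 2, 1)` lands in case 1 and reports Θ(n^1.5850), i.e. Θ(n^(log_2 3)).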
Master Theorem
• There are cases not covered by this theorem.
– For example: f(n) ∈ Θ(n log n).
• Cases 1 and 2 obviously do not apply.
• Case 3 also does not apply because n log n is not in
Ω(n^(log_b a + ε)) = Ω(n^(x + ε)) no matter how small ε > 0 is
(here x = log_b a = 1, as in the example on the next slide).
• Do not forget that n^ε will eventually get larger than log n if
ε > 0.
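To see n^ε overtaking log n numerically, a quick check (ε = 0.1 is my arbitrary choice) shows the ratio (n log n)/n^(1+ε) = (log n)/n^ε shrinking toward 0 once n is large:

```python
import math

# Ratio (n log n) / n^(1+eps) = log2(n) / n^eps for eps = 0.1.
# It eventually decreases toward 0, so n log n is not in Omega(n^(1+eps)).
eps = 0.1
ratios = [math.log2(n) / n**eps for n in (10**5, 10**10, 10**20, 10**40)]
```

The ratios come out roughly 5.25, 3.32, 0.66, 0.013, heading to 0.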
Using the Master Theorem
• Example: T(n)= 3T(n/2)+ Θ(n)
– from multiplication of long numbers
– Here: a = 3, b = 2, x = log_2 3 ≈ 1.5849.
– Note: f(n) ≤ cn for n large enough implies f(n) ∈ O(n^(1.5849 − ε)),
and the first case holds.
• For example, we could pick ε = 0.0849 to get f(n) ∈ O(n^1.5).
Consequently: T(n) ∈ Θ(n^(log_2 3)) = Θ(n^1.5849…).
• Example: T(n) = T(n/2) + 1
– from binary search
– In this recursion: a = 1, b = 2, x = log_2 1 = 0, so n^x = 1, and
the second case holds, implying:
T(n) ∈ Θ(n^0 log n) = Θ(log n).
Using the Master Theorem (cont.)
• Example: T(n) = 3T(n/3) + n^2
– Here: a = 3, b = 3, x = log_3 3 = 1, n^x = n, so the third
case holds, implying T(n) ∈ Θ(f(n)) = Θ(n^2).
• It is possible that no case holds, for example:
– Consider: T(n) = 2T(n/2) + n log n
– Then: x = log 2 = 1.
– But: f(n) = n log n is not in O(n^(1 − ε)), is not in Θ(n),
and is not in Ω(n^(1 + ε)).
– So, we cannot apply the Master Theorem. ☹
Proof Sketch of Master Theorem
• See CLRS: Figure 4.3, p. 77:
– We assume n is a power of b; then floors/ceilings can be
removed.
– The text uses a recursion tree in which every internal node
has a children.
– Cost of the root is f(n), of its children f(n/b), of their
children f(n/b^2), and so on…
– Height of the tree is thus log_b n.
– Cost at the i-th level is a^i f(n/b^i).
– Sum is:
T(n) = Θ(n^(log_b a)) + Σ_{i=0}^{(log_b n) − 1} a^i f(n/b^i)
• Short hand version: T(n) = LEAVES(n) + LEVELS(n).
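The LEAVES(n) + LEVELS(n) decomposition can be checked numerically for a concrete instance; a sketch assuming a = b = 2, f(n) = n, T(1) = 1, and n a power of b (all choices mine, for illustration):

```python
def T(n):
    # Direct recurrence: T(n) = 2*T(n/2) + n, with T(1) = 1.
    return 1 if n == 1 else 2 * T(n // 2) + n

def leaves_plus_levels(n):
    # T(n) = LEAVES(n) + LEVELS(n):
    #   LEAVES(n) = n^(log_b a) leaf nodes, each of cost T(1) = 1
    #   LEVELS(n) = sum over i = 0 .. (log_b n) - 1 of a^i * f(n / b^i)
    a, b = 2, 2
    h = n.bit_length() - 1                    # height log_b n, for b = 2
    leaves = a ** h                           # here n^(log_2 2) = n leaves
    levels = sum(a**i * (n // b**i) for i in range(h))
    return leaves + levels
```

Both sides agree exactly for powers of two, e.g. T(8) = 32 = 8 leaves + (8 + 8 + 8) from the three internal levels.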
Proof Sketch of Master Theorem: Heavy Leaves
• If f(n) ∈ O(n^(x − ε)) for some ε > 0, where x = log_b a:
– We will show that the running time is dominated by
the LEAVES(n) term, which is Θ(n^x).
• Need to show that LEVELS(n) is the same order or a
lower-order term, i.e.: LEVELS(n) ∈ O(LEAVES(n)).
• Doing this means we can ignore LEVELS(n) and hence we
will get T(n) ∈ Θ(n^x) = Θ(n^(log_b a)) as required.
– So let us prove LEVELS(n) ∈ O(n^x):
• Using f(n) ∈ O(n^(log_b a − ε)) = O(n^(x − ε)), which typifies
case 1, we get: