algorithms analysis and design
DESCRIPTION
Algorithms analysis and design, by Lecturer: Aisha Dawood. Recurrences: a recurrence is a function defined in terms of one or more base cases, and itself with smaller arguments. Technical issues: floors and ceilings; exact vs. asymptotic functions.
TRANSCRIPT
![Page 1: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/1.jpg)
Algorithms analysis and design
By Lecturer: Aisha Dawood
![Page 2: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/2.jpg)
A recurrence is a function defined in terms of: one or more base cases, and itself, with smaller arguments.
Recurrences
![Page 3: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/3.jpg)
Technical issues:
• Floors and ceilings
• Exact vs. asymptotic functions
• Boundary conditions
Example: T(n) = 2T(n/2) + Θ(n), with solution T(n) = Θ(n lg n).
• The boundary conditions are usually expressed as T(n) = O(1) for sufficiently small n.
• When we desire an exact, rather than an asymptotic, solution, we need to deal with boundary conditions.
• In practice, we just use asymptotics most of the time, and we ignore boundary conditions.
Recurrences
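The example recurrence T(n) = 2T(n/2) + Θ(n) arises from divide-and-conquer algorithms such as merge sort: each call makes two recursive calls on halves and does linear work merging. Here is a minimal Python sketch of merge sort (my own illustration, not from the slides):

```python
def merge_sort(a):
    """Sort a list; the running time satisfies T(n) = 2T(n/2) + Theta(n)."""
    if len(a) <= 1:                   # base case: T(n) = O(1) for small n
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])        # T(n/2)
    right = merge_sort(a[mid:])       # T(n/2)
    # merge step: Theta(n) work to interleave the two sorted halves
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```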
![Page 4: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/4.jpg)
There are three methods for solving recurrences, i.e., for obtaining asymptotic bounds on the running time: the substitution method, the recursion-tree method, and the master method.
Recurrences
![Page 5: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/5.jpg)
Substitution method:
1. Guess the solution.
2. Use induction to find the constants and show that the solution works.
Example: T(n) = 2T(n/2) + n for n > 1, with T(1) = 1.
1. Guess: T(n) = n lg n + n.
2. Induction:
Base case: n = 1 ⇒ n lg n + n = 1 = T(n) (exact solution).
Inductive step: the inductive hypothesis is that T(k) = k lg k + k for all k < n. We'll use this inductive hypothesis for T(n/2).
Recurrences
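As a quick numeric sanity check (my own addition, assuming the recurrence T(n) = 2T(n/2) + n with T(1) = 1): for powers of two, the guessed exact solution n lg n + n agrees with the recurrence.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # assumed concrete recurrence: T(1) = 1, T(n) = 2T(n/2) + n, n a power of two
    return 1 if n == 1 else 2 * T(n // 2) + n

def closed_form(n):
    # the guessed exact solution n lg n + n; for n = 2^k, lg n = k
    k = n.bit_length() - 1
    return n * k + n
```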
![Page 6: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/6.jpg)
Inductive step, using the hypothesis for T(n/2):
T(n) = 2T(n/2) + n = 2((n/2) lg(n/2) + n/2) + n = n(lg n − 1) + n + n = n lg n + n.
This is an exact solution.
Recurrences
![Page 7: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/7.jpg)
Generally, we use asymptotic notation:
• We would write T(n) = 2T(n/2) + Θ(n).
• We assume T(n) = O(1) for sufficiently small n.
• We express the solution by asymptotic notation: T(n) = Θ(n log n).
• We don't worry about boundary cases, nor do we show base cases in the substitution proof.
For the substitution method:
• Name the constant in the additive term.
• Show the upper (O) and lower (Ω) bounds separately. We might need to use different constants for each.
Recurrences
![Page 8: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/8.jpg)
Example: T(n) = 2T(n/2) + Θ(n). If we want to show an upper bound of T(n) = 2T(n/2) + O(n), we write T(n) ≤ 2T(n/2) + cn for some positive constant c.
1. Upper bound: Guess: T (n) ≤ dn log n for some positive constant d. Substitution:
Recurrences
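The substitution algebra for the upper bound, omitted from the transcript (it was on the slide image), presumably runs as follows; this sketch assumes log denotes log base 2, so log 2 = 1:

```latex
\begin{align*}
T(n) &\le 2T(n/2) + cn \\
     &\le 2\,d\,\tfrac{n}{2}\log\tfrac{n}{2} + cn \\
     &= dn\log n - dn + cn \\
     &\le dn\log n \quad \text{whenever } d \ge c .
\end{align*}
```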
![Page 9: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/9.jpg)
2. Lower bound: Write T(n) ≥ 2T(n/2) + cn for some positive constant c.
Guess: T(n) ≥ dn log n for some positive constant d. Substitution:
Recurrences
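The lower-bound substitution, also omitted from the transcript, is symmetric (again a sketch, assuming log base 2):

```latex
\begin{align*}
T(n) &\ge 2T(n/2) + cn \\
     &\ge 2\,d\,\tfrac{n}{2}\log\tfrac{n}{2} + cn \\
     &= dn\log n - dn + cn \\
     &\ge dn\log n \quad \text{whenever } d \le c .
\end{align*}
```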
![Page 10: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/10.jpg)
There is no general way to guess the correct solutions to recurrences. Guessing a solution takes experience and, occasionally, creativity.
There are some heuristics that can help you become a good guesser (e.g., the recursion-tree method).
If a recurrence is similar to one you have seen before, then guessing a similar solution is reasonable. As an example, consider the recurrence T(n) = 2T(⌊n/2⌋ + 17) + n; we make the guess that T(n) = O(n log n).
Making a good guess
![Page 11: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/11.jpg)
Another way to make a good guess is to prove loose upper and lower bounds on the recurrence and then reduce the range of uncertainty.
For example, we might start with a lower bound of T (n) = Ω(n) for the recurrence, since we have the term n in the recurrence, and we can prove an initial upper bound of T (n) = O(n2). Then, we can gradually lower the upper bound and raise the lower bound until we converge on the correct, asymptotically tight solution of T (n) = Θ(n log n).
Making a good guess
![Page 12: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/12.jpg)
Sometimes you can guess an asymptotic bound on the solution of a recurrence, but somehow the induction doesn't work.
Often, revising the guess by subtracting a lower-order term permits the math to go through.
Example:
Subtleties
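The slide's worked example is not in the transcript; the standard illustration of this trick (as in CLRS) is the recurrence T(n) = T(⌊n/2⌋) + T(⌈n/2⌉) + 1, sketched here under the guess-revision just described:

```latex
% Guess T(n) \le cn. The induction yields only
%   T(n) \le c\lfloor n/2\rfloor + c\lceil n/2\rceil + 1 = cn + 1,
% which does NOT imply T(n) \le cn for any choice of c.
% Revised guess: T(n) \le cn - d for some constant d \ge 0. Then
\begin{align*}
T(n) &\le \bigl(c\lfloor n/2\rfloor - d\bigr) + \bigl(c\lceil n/2\rceil - d\bigr) + 1 \\
     &= cn - 2d + 1 \\
     &\le cn - d \quad \text{as long as } d \ge 1 .
\end{align*}
```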
![Page 13: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/13.jpg)
As an example, consider the recurrence T(n) = 2T(⌊√n⌋) + lg n.
We can simplify this recurrence, though, with a change of variables: setting m = lg n gives
T(2^m) = 2T(2^(m/2)) + m. We can now rename S(m) = T(2^m) to produce the new
recurrence S(m) = 2S(m/2) + m. This new recurrence has the familiar solution S(m) = O(m lg m).
Changing back from S(m) to T(n), we obtain T(n) = T(2^m) = S(m) = O(m lg m) = O(lg n lg lg n).
Changing variables
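A numeric check I added (the base case T(2) = 1 is my assumption): for n of the form 2^(2^k), the square roots and logarithms are exact, and the recurrence unrolls to exactly lg n · (lg lg n + 1), consistent with the O(lg n lg lg n) bound.

```python
import math

def T(n):
    # assumed recurrence: T(2) = 1, T(n) = 2 T(floor(sqrt(n))) + lg n
    if n <= 2:
        return 1.0
    return 2 * T(math.isqrt(n)) + math.log2(n)

# for n = 2^(2^k): isqrt is exact and T(n) = 2^k * (k + 1) = lg n * (lg lg n + 1)
```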
![Page 15: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/15.jpg)
In a recursion tree, each node represents the cost of a single subproblem somewhere in the set of recursive function invocations. We sum the costs within each level of the tree to obtain a set of per-level costs, and then we sum all the per-level costs to determine the total cost of all levels of the recursion.
A recursion tree is best used to generate a good guess, which is then verified by the substitution method.
The recursion-tree method
![Page 16: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/16.jpg)
For example, let us see how a recursion tree would provide a good guess for the recurrence T(n) = 3T(⌊n/4⌋) + Θ(n^2).
We create a recursion tree from the recurrence T(n) = 3T(n/4) + cn^2, where the constant coefficient c > 0.
We get the recursion tree
The recursion-tree method
![Page 17: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/17.jpg)
T(n) = 3T(n/4) + cn^2
![Page 18: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/18.jpg)
The subproblem size for a node at depth i is n/4^i. Thus, the subproblem size hits n = 1 when n/4^i = 1 or, equivalently, when i = log_4 n. Thus, the tree has log_4 n + 1 levels (0, 1, 2, ..., log_4 n).
Next we determine the cost at each level of the tree. Each level has three times more nodes than the level above, and so the number of nodes at depth i is 3^i.
Because subproblem sizes reduce by a factor of 4 for each level we go down from the root, each node at depth i, for i = 0, 1, 2, ..., log_4 n − 1, has a cost of c(n/4^i)^2. Multiplying, we see that the total cost over all nodes at depth i, for i = 0, 1, 2, ..., log_4 n − 1, is 3^i · c(n/4^i)^2 = (3/16)^i cn^2.
The last level, at depth log_4 n, has 3^(log_4 n) = n^(log_4 3) nodes, each contributing cost T(1), for a total cost of n^(log_4 3) · T(1), which is Θ(n^(log_4 3)).
The recursion-tree method
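The per-level accounting above can be replayed numerically. This is my own sketch; c = 1 and T(1) = 1 are assumed constants, and n is taken to be a power of 4 so every division is exact.

```python
def levels_of(n):
    # compute log_4 n by repeated division (n assumed a power of 4)
    i = 0
    while n > 1:
        n //= 4
        i += 1
    return i

def level_costs(n, c=1.0):
    # internal levels i = 0 .. log_4 n - 1 cost (3/16)^i * c * n^2 each;
    # the last level has 3^(log_4 n) = n^(log_4 3) leaves, each costing T(1) = 1
    depth = levels_of(n)
    internal = [(3 / 16) ** i * c * n * n for i in range(depth)]
    leaves = 3 ** depth
    return internal, leaves

n = 4 ** 6
internal, leaves = level_costs(n)
total = sum(internal) + leaves
# the geometric series is bounded by (16/13) * c * n^2, so total = O(n^2)
```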
![Page 19: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/19.jpg)
Now we add up the costs over all levels to determine the cost for the entire tree:
T(n) = the sum for i = 0 to log_4 n − 1 of (3/16)^i cn^2, plus Θ(n^(log_4 3)) for the last level.
The recursion-tree method
As an upper bound, we can extend the sum to an infinite decreasing geometric series:
T(n) ≤ the sum for i = 0 to ∞ of (3/16)^i cn^2, plus Θ(n^(log_4 3)), which equals (16/13)cn^2 + Θ(n^(log_4 3)) = O(n^2).
![Page 20: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/20.jpg)
Now we can use the substitution method to verify that our guess was correct, that is, that T(n) = O(n^2) is an upper bound for the recurrence T(n) = 3T(⌊n/4⌋) + Θ(n^2).
We want to show that T(n) ≤ dn^2 for some constant d > 0. Using the same constant c > 0 as before:
T(n) ≤ 3T(⌊n/4⌋) + cn^2 ≤ 3d⌊n/4⌋^2 + cn^2 ≤ 3d(n/4)^2 + cn^2 = (3/16)dn^2 + cn^2 ≤ dn^2,
where the last step holds as long as d ≥ (16/13)c.
The recursion-tree method
![Page 21: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/21.jpg)
Another example: the recursion tree for T (n) = T(n/3) + T(2n/3) + O(n).
The recursion-tree method
![Page 22: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/22.jpg)
Another example: the recursion tree for T (n) = T(n/3) + T(2n/3) + O(n).
The recursion-tree method
![Page 23: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/23.jpg)
The longest path from the root to a leaf is n → (2/3)n → (2/3)^2 n → ··· → 1. Since (2/3)^k n = 1 when k = log_{3/2} n, the height of the tree is log_{3/2} n.
Intuitively, we expect the solution to the recurrence to be at most O(cn log_{3/2} n) = O(n lg n).
Each level contributes ≤ cn.
• Lower bound guess: T(n) ≥ dn log_3 n = Ω(n lg n) for some positive constant d.
• Upper bound guess: T(n) ≤ dn log_{3/2} n = O(n lg n) for some positive constant d.
• Then prove by substitution.
The recursion-tree method
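A small experiment I added that evaluates this recurrence directly; the constant c = 1 and the base cost 1 for n ≤ 2 are my assumptions, not the slides'. The result stays within a constant factor of n lg n, matching the guess.

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def cost(n):
    # assumed concrete instance: cost(n) = cost(n/3) + cost(2n/3) + n,
    # with base cost 1 for n <= 2
    if n <= 2:
        return 1
    third = n // 3
    return cost(third) + cost(n - third) + n

ratio = cost(4096) / (4096 * math.log2(4096))
# ratio is a modest constant, consistent with Theta(n lg n)
```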
![Page 24: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/24.jpg)
The recursion-tree method
![Page 25: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/25.jpg)
The recursion-tree method
![Page 26: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/26.jpg)
The master method provides a "cookbook" method for solving recurrences of the form
T(n) = aT(n/b) + f(n),
where a ≥ 1 and b > 1 are constants and f(n) is an asymptotically positive function. The master method requires memorization of three cases, but then the solution of many recurrences can be determined quite easily, often without pencil and paper.
The Master method
![Page 27: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/27.jpg)
(Master theorem) Compare f(n) with n^(log_b a):
1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
The Master method
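The three cases can be mechanized for the common special case f(n) = Θ(n^d). This is a simplification I'm adding (for polynomial f, the regularity condition of case 3 holds automatically, since a·f(n/b) = (a/b^d)·f(n) with a/b^d < 1):

```python
import math

def master_poly(a, b, d):
    """Simplified master theorem for T(n) = a T(n/b) + Theta(n^d)."""
    crit = math.log(a, b)            # critical exponent log_b a
    if d < crit - 1e-9:              # case 1: f grows slower than n^(log_b a)
        return f"Theta(n^{crit:g})"
    if abs(d - crit) <= 1e-9:        # case 2: f matches n^(log_b a)
        return f"Theta(n^{d:g} lg n)"
    return f"Theta(n^{d:g})"         # case 3: f dominates
```

For example, merge sort's recurrence T(n) = 2T(n/2) + Θ(n) falls into case 2, and the earlier recurrence T(n) = 3T(n/4) + Θ(n^2) falls into case 3.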
![Page 28: Algorithms analysis and design](https://reader035.vdocuments.us/reader035/viewer/2022062305/568166db550346895ddafbf3/html5/thumbnails/28.jpg)