3 alg analysis 2 (f3)
TRANSCRIPT
-
7/29/2019 3 Alg Analysis 2 (f3)
1/71
Complexity Analysis of
Algorithms (2)
"Big Oh" - Upper Bounding Running Time
Complexity Analysis: Big-O Notation

Formal Definition:
Let f and g be two functions such that f: N → R and g: N → R. We write

    f(n) = O(g(n))

if there exist positive constants c and n0 such that

    f(n) ≤ c·g(n)  for all n ≥ n0

or if

    lim_{n→∞} f(n)/g(n) = c < ∞

So g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above).
Complexity Analysis: Big-O Notation

Informal Definition:
f(n) is O(g(n)) if f grows at most as fast as g.
"Grow" refers to the value of f(n) as n increases.
c·g(n) is an approximation to f(n), bounding from above.
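The witnesses c and n0 in the formal definition can be checked numerically for a concrete pair of functions. A minimal sketch (the functions f, g and the constants c = 4, n0 = 10 are illustrative choices, not from the slides):

```python
# Illustrative check of the Big-O definition: f(n) = 3n + 10 is O(n),
# witnessed by the constants c = 4 and n0 = 10.
def f(n):
    return 3 * n + 10

def g(n):
    return n

c, n0 = 4, 10

# f(n) <= c * g(n) must hold for every n >= n0.
holds = all(f(n) <= c * g(n) for n in range(n0, 10_000))
print(holds)  # True, since 3n + 10 <= 4n exactly when n >= 10
```

Any larger c (or larger n0) also works; the definition only requires that some pair exists.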
Complexity Analysis
1. Computational and Asymptotic Complexity
2. Big-O Notation
3. Properties of Big-O Notation
4. Omega and Theta Notations
5. Example of Complexities
6. Finding Asymptotic Complexity: Examples
7. The Best, Average, and Worst Cases
"Big Omega" - Lower Bounding Running Time
Complexity Analysis: Big-Omega Notation

Formal Definition:
Let f and g be two functions such that f: N → R and g: N → R. We write

    f(n) = Ω(g(n))

if there exist positive constants c and n0 such that

    f(n) ≥ c·g(n)  for all n ≥ n0

or if

    lim_{n→∞} f(n)/g(n) = c > 0

So g(n) is an asymptotic lower bound for f(n) as n increases (g(n) bounds f(n) from below).
Complexity Analysis: Big-Omega Notation

Informal Definition:
f(n) is Ω(g(n)) if f grows at least as fast as g.
"Grow" refers to the value of f(n) as n increases.
c·g(n) is an approximation to f(n), bounding from below.
"Theta" - Tightly Bounding Running Time!

Complexity Analysis: Big-Theta Notation

Formal Definition:
Let f and g be two functions such that f: N → R and g: N → R. We write

    f(n) = Θ(g(n))

if there exist positive constants c1, c2, and n0 such that

    c1·g(n) ≤ f(n) ≤ c2·g(n)  for all n ≥ n0

or if

    lim_{n→∞} f(n)/g(n) = c,  0 < c < ∞

So g(n) is an asymptotic tight bound for f(n) as n increases.
Complexity Analysis: Big-Theta Notation

Informal Definition:
f(n) is Θ(g(n)) if f is essentially the same as g, to within a constant multiple.
"Grow" refers to the value of f(n) as n increases.
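The two-sided bound in the Theta definition can likewise be checked numerically. A sketch with illustrative witnesses (f, c1 = 2, c2 = 3, n0 = 3 are chosen for this example, not taken from the slides):

```python
# Illustrative check of the Big-Theta definition: f(n) = 2n^2 + 3n is
# Theta(n^2), witnessed by c1 = 2, c2 = 3, n0 = 3.
def f(n):
    return 2 * n * n + 3 * n

c1, c2, n0 = 2, 3, 3

# c1*g(n) <= f(n) <= c2*g(n) must hold for every n >= n0, with g(n) = n^2.
holds = all(c1 * n * n <= f(n) <= c2 * n * n for n in range(n0, 10_000))
print(holds)  # True: the upper bound 2n^2 + 3n <= 3n^2 needs n >= 3
```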
Complexity Analysis
1. Computational and Asymptotic Complexity
2. Big-O Notation
3. Properties of Big-O Notation
4. Omega and Theta Notations
5. Example of Complexities
6. Finding Asymptotic Complexity: Examples
7. The Best, Average, and Worst Cases
Determination of Time Complexity for Simple Algorithms

Because of the approximations available through Big-Oh, the actual T(n) of an algorithm is not calculated, although T(n) may be determined empirically. Big-Oh is usually determined by applying five simple rules:
RULE #1:
Simple program statements are assumed to take a constant amount of time, which is O(1), i.e., not dependent upon n.
RULE #2:
Differences in the execution times of simple statements are ignored.
RULE #3:
In conditional statements,
the worst case is always used.
RULE #4 (the sum rule):
The running time of a sequence of steps has the order of the running time of the largest step.
RULE #4 (example):
If two sequential steps have times O(n) and O(n^2), then the running time of the sequence is O(n^2). If n is large, the running time due to the O(n) step is insignificant compared to the running time of the squared process (e.g., if n = 1000, then 1000 is not significant compared to 1,000,000).
RULE #5 (the product rule):
If two processes are constructed such that the second process is repeated a number of times for each n in the first process, then O is equal to the product of the orders of magnitude of both processes.
RULE #5 (example):
For example, a two-dimensional array has one for loop inside another, and the internal loop is executed n times for each value of the external loop. The O is therefore O(n·n) = O(n^2).
Rules for Big-Oh (Summary)
- If T(n) = O(c·f(n)) for a constant c, then T(n) = O(f(n)).
- If T1(n) = O(f(n)) and T2(n) = O(g(n)), then T1(n) + T2(n) = O(max(f(n), g(n))) (the sum rule).
- If T1(n) = O(f(n)) and T2(n) = O(g(n)), then T1(n) · T2(n) = O(f(n) · g(n)) (the product rule).
- If f is a polynomial of degree k, then f(n) = O(n^k).
If f is a polynomial of degree k, then f(n) = O(n^k). For example,

    f(n) = n² + 100n + n·log₁₀ n + 1000  ⟹  f(n) = O(n²)
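Reading the slide's example as f(n) = n² + 100n + n·log₁₀ n + 1000, the domination of the n² term can be seen numerically: dividing by n² makes every lower-order term vanish as n grows. A quick sketch:

```python
import math

# f(n) = n^2 + 100n + n*log10(n) + 1000: the ratio f(n)/n^2 approaches 1
# as n grows, so the quadratic term dominates and f(n) = O(n^2).
def f(n):
    return n**2 + 100 * n + n * math.log10(n) + 1000

for n in (10, 1_000, 1_000_000):
    print(n, f(n) / n**2)
# the ratio shrinks toward 1 as n increases
```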
To understand these rules, some simple algorithms will be analyzed in their code form. There are features in the code which allow us to determine O() by applying the rules in the previous notes.
Code Features
- Assignment: O(1)
- Procedure entry: O(1)
- Procedure exit: O(1)
- if A then B else C: time for test A + O(max{B, C})
- Loop: sum over all iterations of the time for each iteration; combine using the sum and product rules.
- Exception: recursive algorithms
Examples
Nested Loops
Sequential statements
Conditional statements
More nested loops
Nested Loops

Running time of a loop equals the running time of the code within the loop times the number of iterations. Nested loops: analyze inside out.
- A single loop that executes an O(1) body n times: O(1·n) = O(n).
- An outer loop that executes an O(n) inner loop n times: O(n·n) = O(n²).

Note: running time grows with nesting rather than the length of the code.
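The inside-out multiplication of loop counts can be confirmed by instrumenting a doubly nested loop and counting how often the body runs. A minimal sketch:

```python
# An O(1) body inside two loops of n iterations each executes exactly
# n * n times, matching the O(n * n) = O(n^2) analysis.
def nested_count(n):
    count = 0
    for i in range(n):
        for j in range(n):
            count += 1  # the O(1) body
    return count

print(nested_count(50))  # 2500, i.e. 50^2
```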
Sequential Statements

For a sequence S1; S2; …; Sk of statements, the running time is the maximum of the running times of the individual statements.
Example: an O(n) statement followed by an O(n²) statement has running time max(O(n), O(n²)) = O(n²).
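The sum rule behind this can be made concrete by counting the work of two sequential pieces. A sketch:

```python
# An O(n) loop followed by an O(n^2) nested loop: the exact count is
# n + n^2, and asymptotically the n^2 term dominates the sum.
def sequential_count(n):
    count = 0
    for i in range(n):            # first statement: O(n)
        count += 1
    for i in range(n):            # second statement: O(n^2)
        for j in range(n):
            count += 1
    return count

print(sequential_count(100))  # 10100 = 100 + 100^2
```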
Conditional Statements

The running time of
    if ( cond ) S1 else S2
is the running time of cond plus the max of the running times of S1 and S2.
Example: if S1 is O(n) and S2 is O(1), the total is the time for cond plus O(max(n, 1)) = O(n).
More Nested Loops

If the inner loop executes (n − i) times for each value of i, i = 0, 1, …, n − 1, the total number of iterations is

    Σ_{i=0}^{n-1} (n − i) = n + (n − 1) + … + 1 = n(n + 1)/2 = n²/2 + n/2 = O(n²)
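The triangular sum can be verified by counting iterations of a loop whose inner bound depends on the outer index. A sketch:

```python
# When the inner loop runs (n - i) times for i = 0 .. n-1, the total is
# n + (n-1) + ... + 1 = n(n+1)/2, which is O(n^2).
def triangular_count(n):
    count = 0
    for i in range(n):
        for j in range(i, n):   # executes n - i times
            count += 1
    return count

n = 100
print(triangular_count(n), n * (n + 1) // 2)  # both 5050
```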
What does the following algorithm do?
Analyze its worst-case running time, and express it using Big-Oh notation.

Solution
This algorithm computes a^n. The running time of this algorithm is O(n) because:
- the initial assignments take constant time
- each iteration of the while loop takes constant time
- there are exactly n iterations
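The algorithm itself did not survive extraction; a minimal sketch consistent with the stated solution (variable names are my own), with one multiplication per loop iteration and hence O(n) running time:

```python
# Computes a^n with a simple while loop: constant-time initial
# assignments, then exactly n constant-time iterations -> O(n).
def power(a, n):
    result = 1
    i = 0
    while i < n:            # exactly n iterations
        result = result * a # constant time per iteration
        i = i + 1
    return result

print(power(2, 10))  # 1024
```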
Example : Counting Sequences

void CompoundInterest( float savings, float interest, int years )
begin
    float tmp, result;
1.  tmp = exp( 1+interest, years );
2.  result = savings * tmp;
end

Total time taken = time for statement 1 + time for statement 2
                 = time for the exp(.) call + time to multiply two floats
Total space taken = 2 floats, or 8 bytes.
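A runnable version of the same sequence (using Python's `**` operator in place of the slide's exp(.) helper) shows why the cost is constant: two simple statements execute exactly once each, regardless of the numeric inputs.

```python
# CompoundInterest as two sequential simple statements: each runs once,
# so the running time is O(1) for any savings, interest, or years.
def compound_interest(savings, interest, years):
    tmp = (1 + interest) ** years   # stands in for the slide's exp(.) call
    result = savings * tmp
    return result

print(compound_interest(1000.0, 0.05, 2))  # ~1102.5
```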
Example : Counting Sequences

void CompoundInterest( float savings, float interest, int years )
begin
    float tmp, result;
    tmp = exp( 1+interest, years );
    result = savings * tmp;
end

    Statement                         Unit Cost (amount of work)   Times
    tmp = exp( 1+interest, years )    t_exp                        1
    result = savings * tmp            t_*                          1

Total time taken = t_exp + t_*
Example : Counting For-Loops

void printLine( int n, char ch )
begin
    int I;
    I = 1;
    while ( I <= n ) do
        print( ch );
        I = I + 1;
    endwhile
end
Example : Counting For-Loops

Total time taken:

    T(n) = t_= + (n + 1)·t_test + n·t_p + n·t_+ + n·t_goto    (eq 1)

If t_= = t_test = t_+ = t_goto = c, then (eq 1) simplifies to:

    T(n) = c(3n + 2) + n·t_p

which is linearly proportional to n, i.e., O(n).
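The per-operation counts in (eq 1) can be verified by instrumenting the loop. A sketch (the print is replaced by a counter; counter names are my own): the loop test runs n + 1 times (n successes plus the final failing test), while the body statements each run n times.

```python
# Counting the operations of printLine(n, ch): tests of (I <= n) happen
# n+1 times; the print and the increment each happen n times.
def print_line_counts(n):
    tests = prints = increments = 0
    i = 1
    while True:
        tests += 1                 # the loop test
        if not (i <= n):
            break
        prints += 1                # stands in for print(ch)
        increments += 1            # I = I + 1
        i += 1
    return tests, prints, increments

print(print_line_counts(10))  # (11, 10, 10)
```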
Example : Bubblesort

void Bubblesort( int[] A )    // A[1..n]
1. begin
2.   for i = 1 to n-1 do
3.     for j = 1 to n-i do
4.       if A[j] > A[j+1] then
5.         swap A[j] with A[j+1]
6. end

Line 1,6: O(1)
Line 4,5: O(1)
Line 3-5: O(n − i)
Line 2-5: O( Σ_{i=1}^{n-1} (n − i) ) = O(n²)
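The comparison count behind the O(n²) bound can be verified directly: the inner loop performs (n − i) comparisons for each i, for a total of n(n − 1)/2 regardless of the input order. A sketch (0-indexed, as is idiomatic in Python):

```python
# Bubblesort performs sum_{i=1}^{n-1} (n - i) = n(n-1)/2 comparisons
# no matter how the input is ordered, hence O(n^2).
def bubblesort(A):
    comparisons = 0
    n = len(A)
    for i in range(1, n):            # i = 1 .. n-1
        for j in range(0, n - i):    # n - i comparisons
            comparisons += 1
            if A[j] > A[j + 1]:
                A[j], A[j + 1] = A[j + 1], A[j]
    return comparisons

A = [5, 3, 1, 4, 2]
print(bubblesort(A), A)  # 10 comparisons (= 5*4/2); A is now sorted
```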
When we analyze an algorithm, we need to focus only on the dominant (high-cost) operations and avoid a line-by-line exact analysis.
Recipe for Determining O()
- Break the algorithm down into known pieces
- Identify relationships between the pieces
  - Sequential is additive
  - Nested (loop / recursion) is multiplicative
- Drop constants
- Keep only the dominant factor for each variable
Example : Polynomial Growth

for k=1 to n do            // pseudocode
    for j=1 to n do
        x = x + 1          // count this line, i.e.,
                           // count additions/assignments

T(n) = number of additions for input size n:

    T(n) = n + n + … + n  (n terms) = n·n = n²
Example : Logarithmic Growth

1. k = n;
2. while ( k >= 1 )    // top
3.     x = x + 1;      // count this line
4.     k = k / 2;      // k is halved
5. end

    Iteration #   value of k (at entry)   # times line 3 executed
    1             n                       1
    2             n/2                     1
    3             n/2²                    1
    4             n/2³                    1
    …             …                       …
    m−1           n/2^(m−2)               1
    m             n/2^(m−1)               1
Example : Logarithmic Growth (cont)

We are interested in what m is, because that is the number of times line 3 is executed. In other words,

    T(n) = 1 + 1 + … + 1  (m times) = m·1 = m

To derive m, we look at the last iteration:

    n/2^(m−1) = 1  ⟹  n = 2^(m−1)  ⟹  log₂ n = m − 1  ⟹  m = log₂ n + 1    (eq 1)

From (eq 1),

    T(n) = (log₂ n + 1)·1 = O(lg n)
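The iteration count m can be checked by running the halving loop itself and counting how often line 3 would execute. A sketch (for n that is not a power of two, integer halving gives floor(log₂ n) + 1 iterations):

```python
# Counting iterations of the halving loop: k goes n, n/2, n/4, ..., 1,
# so the counted line executes floor(log2 n) + 1 times for n >= 1.
def halving_count(n):
    k, m = n, 0
    while k >= 1:
        m += 1        # line 3: x = x + 1
        k = k // 2    # line 4: k is halved
    return m

print(halving_count(1024))  # 11 = log2(1024) + 1
print(halving_count(10))    # 4  = floor(log2(10)) + 1
```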
Example : Insertion Sort

InsertionSort( int[] A )        // A is an n-element array
begin                           // ignore function entry costs
    int i, j, key;              // ignore compile-time costs
    for j = 2 to length of A do
        key = A[j];
        i = j - 1;
        while i > 0 and A[i] > key do
            A[i+1] = A[i];
            i = i - 1;
        endwhile                // ignore goto costs
        A[i+1] = key;
    endfor                      // ignore goto costs
end                             // ignore exit costs

Exercise: fill in the unit cost (amount of work) and the number of times each statement executes, then derive O(?).
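As a hint toward the exercise, the variable cost sits in the inner while loop: counting the element shifts it performs shows n(n − 1)/2 in the worst case (reversed input) and 0 in the best case (already sorted input). A sketch (0-indexed Python version of the pseudocode above):

```python
# Instrumented insertion sort: counts how many times the inner while
# loop shifts an element. Worst case (reversed): n(n-1)/2. Best case
# (sorted): 0.
def insertion_sort_shifts(A):
    shifts = 0
    for j in range(1, len(A)):
        key = A[j]
        i = j - 1
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]
            shifts += 1
            i -= 1
        A[i + 1] = key
    return shifts

print(insertion_sort_shifts([5, 4, 3, 2, 1]))  # 10 = 5*4/2 (worst case)
print(insertion_sort_shifts([1, 2, 3, 4, 5]))  # 0 (best case)
```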
Example : Portion of a selection sort which does the sorting

1  for (i = 0; i < n; i++)             ?
2  {
3      m = i;                          O(1)
4      for (j = i + 1; j < n; j++)     ?
5          if (A[j] < A[m]) m = j;     ?
6      swap A[i] with A[m];            ?
7  }
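The counts marked "?" can be filled in empirically: the inner loop runs n − 1, n − 2, …, 1 comparisons as i advances, for n(n − 1)/2 in total, an O(n²) cost that does not depend on the input order. A sketch:

```python
# Instrumented selection sort: counts the comparisons of the inner loop.
# Total = (n-1) + (n-2) + ... + 1 = n(n-1)/2, i.e. O(n^2), even for
# already-sorted input.
def selection_sort_comparisons(A):
    comparisons = 0
    n = len(A)
    for i in range(n):
        m = i                       # index of the current minimum
        for j in range(i + 1, n):
            comparisons += 1
            if A[j] < A[m]:
                m = j
        A[i], A[m] = A[m], A[i]     # swap A[i] with A[m]
    return comparisons

print(selection_sort_comparisons([3, 1, 4, 1, 5]))  # 10 = 5*4/2
```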
Complexity Analysis
1. Computational and Asymptotic Complexity
2. Big-O Notation
3. Properties of Big-O Notation
4. Omega and Theta Notations
5. Example of Complexities
6. Finding Asymptotic Complexity: Examples
7. The Best, Average, and Worst Cases
Worst, Average and Best Case Behaviors

Depends on the type of input problem instance.
Worst, Average and Best Case Exact Timings
Worst, Average and Best Case Behaviors

The worst-case complexity of the algorithm is the function defined by the maximum number of steps taken on any instance of size n. It represents the curve passing through the highest point in each column.

The best-case complexity is the function defined by the minimum number of steps taken on any instance of size n. It represents the curve passing through the lowest point in each column.

The average-case complexity is the function defined by the average number of steps taken on any instance of size n.
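Linear search is the classic illustration of these three cases: on an array of n elements, the best case takes 1 step (target in the first position) and the worst case takes n steps (target last or absent). A sketch:

```python
# Step counts for linear search over n elements: best case 1 step,
# worst case n steps (last position or not present at all).
def linear_search_steps(A, x):
    steps = 0
    for v in A:
        steps += 1
        if v == x:
            return steps
    return steps

A = list(range(1, 11))            # n = 10
print(linear_search_steps(A, 1))    # 1  (best case: first element)
print(linear_search_steps(A, 10))   # 10 (worst case: last element)
print(linear_search_steps(A, 99))   # 10 (worst case: absent)
```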
Basic Asymptotic Efficiency Classes
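The efficiency-class tables on these slides did not survive extraction. Assuming the standard classes (constant, logarithmic, linear, linearithmic, quadratic, cubic, exponential, factorial), their growth can be compared numerically. A sketch:

```python
import math

# Values of the basic asymptotic efficiency classes at n = 10, showing
# how sharply the later classes outgrow the earlier ones.
n = 10
classes = {
    "1":       1,
    "log n":   math.log2(n),
    "n":       n,
    "n log n": n * math.log2(n),
    "n^2":     n ** 2,
    "n^3":     n ** 3,
    "2^n":     2 ** n,
    "n!":      math.factorial(n),
}
for name, value in classes.items():
    print(f"{name:8s} {value:>12.1f}")
```

Even at n = 10, the factorial class is already more than three thousand times the exponential class.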