
1

Algorithm Analysis (Big O)

2

Complexity

In examining algorithm efficiency we must understand two ideas of complexity:
- Space complexity
- Time complexity

3

Space Complexity

When memory was expensive, we focused on making programs as space efficient as possible, and developed schemes to make memory appear larger than it really was (virtual memory and memory paging schemes).

Space complexity is still important in the field of embedded computing (handheld, computer-based equipment like cell phones, palm devices, etc.).

4

Time Complexity

- Is the algorithm “fast enough” for my needs?
- How much longer will the algorithm take if I increase the amount of data it must process?
- Given a set of algorithms that accomplish the same thing, which is the right one to choose?

5

Algorithm Efficiency

A measure of the amount of resources consumed in solving a problem of size n:
- time
- space

Benchmarking: implement the algorithm, run it with some specific input, and measure the time taken
- better for comparing the performance of processors than for comparing the performance of algorithms

Big Oh (asymptotic analysis): associates n, the problem size, with t, the processing time required to solve the problem

6

Cases to examine

- Best case: the input causes the algorithm to execute the fewest possible instructions
- Average case: over all inputs, executing the algorithm produces path lengths that are on average the same
- Worst case: the input causes the algorithm to execute the maximum possible instructions

(The linear search sketch below illustrates all three.)
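As a concrete illustration (not from the original slides), here is a minimal linear search in C++; the comments mark where each case arises:

    #include <iostream>
    #include <vector>

    // Linear search: returns the index of key in a, or -1 if it is absent.
    int linearSearch(const std::vector<int>& a, int key)
    {
        for (int i = 0; i < (int) a.size(); i++)
            if (a[i] == key)
                return i;   // best case: key is at a[0], one comparison
        return -1;          // worst case: key absent, n comparisons
    }

    int main()
    {
        std::vector<int> a = {4, 8, 15, 16, 23, 42};
        std::cout << linearSearch(a, 4)  << "\n";   // best case: index 0
        std::cout << linearSearch(a, 99) << "\n";   // worst case: -1 after 6 comparisons
    }
    // Average case: if the key is equally likely to be at any position,
    // about n/2 comparisons are done.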

7

Worst case analysis

Of the three cases, the only useful case (from the standpoint of program design) is the worst case.

Worst case analysis helps answer the software lifecycle question: if it's good enough today, will it be good enough tomorrow?

8

Frequency Count

Examine a piece of code and predict the number of instructions to be executed, e.g.:

    Code                              Inst #    F.C.
    for (int i = 0; i < n; i++)         1       n+1
    {   cout << i;                      2       n
        p = p + i;                      3       n
    }                                           ____
                                                3n+1

For each instruction, predict how many times it will be encountered as the code runs. Totaling the counts produces the F.C. (frequency count).
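For reference, a self-contained version of the fragment above; the declarations of n and p are my additions (the original fragment assumes them), with each line's frequency count as a comment:

    #include <iostream>
    using namespace std;   // matches the slides' use of cout

    int main()
    {
        int n = 10;                      // problem size (value assumed)
        int p = 0;                       // accumulator (initial value assumed)
        for (int i = 0; i < n; i++)      // loop test runs n+1 times
        {
            cout << i;                   // runs n times
            p = p + i;                   // runs n times
        }                                // frequency count total: 3n+1
        return 0;
    }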

9

Order of magnitude

In the previous example, best_case = avg_case = worst_case, because the example is based on a fixed iteration count n.

By itself, a frequency count is relatively meaningless. An order of magnitude is an estimate of performance vs. amount of data. To convert an F.C. to an order of magnitude:
- discard constant terms
- disregard coefficients
- pick the most significant term

The order of magnitude of the worst case path through the algorithm is the Big O (e.g. O(n)).

10

Another example

    Code                              Inst #    F.C.
    for (int i = 0; i < n; i++)         1       n+1
      for (int j = 0; j < n; j++)       2       n(n+1) = n^2+n
      {   cout << i;                    3       n*n = n^2
          p = p + i;                    4       n*n = n^2
      }                                         ____
                                                3n^2+2n+1

Discarding constant terms produces: 3n^2+2n
Clearing coefficients: n^2+n
Picking the most significant term: n^2

Big O = O(n^2)

11

What is Big O?

Big O expresses the rate at which algorithm performance degrades as a function of the amount of data it is asked to handle. For example:
- O(n): performance degrades at a linear rate (doubling n roughly doubles the time)
- O(n^2): quadratic degradation (doubling n roughly quadruples the time)

12

Common growth rates

13

Big Oh - Formal Definition

Definition of "big oh": f(n)=O(g(n)), iff there exist constants c and

n0 such that: f(n) <= c g(n) for all n>=n0

Thus, g(n) is an upper bound on f(n) Note:

f(n) = O(g(n)) is NOT the same as

O(g(n)) = f(n) The '=' is not the usual mathematical

operator "=" (it is not reflexive)

14

Comparing Algorithms and ADT Data Structures

Big-O Notation

15

Algorithm Efficiency

A measure of the amount of resources consumed in solving a problem of size n:
- time
- space

Benchmarking: code the algorithm, run it with some specific input, and measure the time taken
- better for measuring and comparing the performance of processors than for measuring and comparing the performance of algorithms

Big Oh (asymptotic analysis): provides a formula that associates n, the problem size, with t, the processing time required to solve the problem

16

big Oh measures an algorithm's growth rate

How fast does the time required for an algorithm to execute increase as the size of the problem increases?

Big Oh:
- is an intrinsic property of the algorithm, independent of the particular machine or code
- is based on the number of instructions executed
- is data-dependent for some algorithms
- is meaningful for “large” problem sizes

17

Computing x^n for n >= 0

Iterative definition:
    x * x * x * ... * x (n times)

Recursive definition:
    x^0 = 1
    x^n = x * x^(n-1) (for n > 0)

Another recursive definition (n/2 is integer division, rounding down):
    x^0 = 1
    x^n = (x^(n/2))^2 (for n > 0 and n even)
    x^n = x * (x^(n/2))^2 (for n > 0 and n odd)

18

Iterative Power function

    double IterPow (double X, int N)          F.C.
    {
        double Result = 1;                     1
        while (N > 0)                          n+1   <- critical region
        {
            Result *= X;                       n
            N--;                               n
        }
        return Result;                         1
    }                                         ____
                 total instruction count:     3n+3

The algorithm's computing time (t) as a function of n is 3n + 3. t is on the order of f(n), written O[f(n)], and O[3n + 3] is n.

19

Recursive Power function

    double RecPow (double X, int N)
    {
        if (N == 0)                          // 1 (both cases)
            return 1;                        // 1 (base case)
        else
            return X * RecPow(X, N - 1);     // 1 + T(n-1) (recursive case)
    }
    // base case total:      2
    // recursive case total: 2 + T(n-1)

Number of times the base case is executed: 1
Number of times the recursive case is executed: n

The algorithm's computing time (t) as a function of n is 2n + 2, and O[2n + 2] is n.
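A worked check of that count (not on the original slide): the table gives the recurrence T(0) = 2 and T(n) = 2 + T(n-1) for n > 0. Unrolling it adds 2 once per recursive level: T(n) = 2n + T(0) = 2n + 2, confirming O(n).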

20

Another Power Function

    double Pow3 (double X, int N)
    {
        if (N == 0)                                  // 1
            return 1;                                // 1 (base case)
        else
        {
            double halfPower = Pow3(X, N/2);         // 1 + T(n/2)
            if (N % 2 == 0)
                return halfPower * halfPower;        // 1 (even)
            else
                return X * halfPower * halfPower;    // 1 (odd)
        }
    }
    // base case total:      2
    // recursive case total: 3 + T(n/2)

Number of times the base case is executed: 1
Number of times the recursive case is executed: log2 n

The algorithm's computing time (t) as a function of n is 3 log2 n + 2, and O[3 log2 n + 2] is log2 n.
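A worked check of that count (not on the original slide): the table gives T(0) = 2 and T(n) = 3 + T(n/2) for n > 0. Since n is halved at every call, there are about log2 n recursive levels, each contributing 3 instructions: T(n) is approximately 3 log2 n + 2, confirming O(log2 n).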

21

Computational Complexity

The computing time, T(n), of an algorithm is a function of the problem size (based on instruction count):
- T(n) for IterPow is 3n + 3
- T(n) for RecPow is 2n + 2
- T(n) for Pow3 is 3 log2 n + 2

The computational complexity of an algorithm is the rate at which T(n) grows as the problem size grows, and is expressed using "big Oh" notation:
- the growth rate (big Oh) of 3n+3 and of 2n+2 is n
- the big Oh of 3 log2 n + 2 is log2 n

22

Common big Ohs

    constant       O(1)
    logarithmic    O(log2 N)
    linear         O(N)
    n log n        O(N log2 N)
    quadratic      O(N^2)
    cubic          O(N^3)
    exponential    O(2^N)

23

Comparing Growth Rates

[Chart: T(n) vs. problem size for the growth rates log2 n, n, n log2 n, n^2, and 2^n]

24

An Experiment

Execution time in seconds (1,000,000 repetitions):

               2^25    2^50    2^100
    IterPow     .71    1.15     2.03
    RecPow     3.63    7.42    15.05
    Pow3        .99    1.15     1.38
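The slides do not show the code behind these timings; the following is a minimal sketch of how such an experiment might be reproduced, using the three functions as defined on slides 18-20. The harness itself (the function name seconds, the use of std::chrono, base 2.0, and the volatile sink) is my assumption, not the original experiment's code:

    #include <chrono>
    #include <initializer_list>
    #include <iostream>

    // The three power functions as defined on slides 18-20.
    double IterPow (double X, int N)
    {
        double Result = 1;
        while (N > 0) { Result *= X; N--; }
        return Result;
    }
    double RecPow (double X, int N)
    {
        if (N == 0) return 1;
        else return X * RecPow(X, N - 1);
    }
    double Pow3 (double X, int N)
    {
        if (N == 0) return 1;
        double halfPower = Pow3(X, N / 2);
        if (N % 2 == 0) return halfPower * halfPower;
        else return X * halfPower * halfPower;
    }

    // Times 1,000,000 calls of one power function for exponent n.
    double seconds(double (*f)(double, int), int n)
    {
        volatile double sink = 0;   // keeps the calls from being optimized away
        auto start = std::chrono::steady_clock::now();
        for (int rep = 0; rep < 1000000; rep++)
            sink = f(2.0, n);
        auto stop = std::chrono::steady_clock::now();
        (void) sink;
        return std::chrono::duration<double>(stop - start).count();
    }

    int main()
    {
        for (int n : {25, 50, 100})
            std::cout << "n = " << n
                      << "  IterPow: " << seconds(IterPow, n) << "s"
                      << "  RecPow: "  << seconds(RecPow, n)  << "s"
                      << "  Pow3: "    << seconds(Pow3, n)    << "s\n";
    }

Absolute times will differ by machine; the pattern to look for is RecPow and IterPow scaling with n while Pow3 barely grows.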

25

Uses of big Oh

Compare algorithms which perform the same function:
- search algorithms
- sorting algorithms

Compare data structures for an ADT:
- each operation is an algorithm and has a big Oh
- the data structure chosen affects the big Oh of the ADT's operations

26

Comparing algorithms

Sequential search:
- growth rate is O(n)
- average number of comparisons done is n/2

Binary search:
- growth rate is O(log2 n)
- average number of comparisons done is 2((log2 n) - 1)

    n        n/2     2((log2 n) - 1)
    100       50     12
    500      250     16
    1000     500     18
    5000    2500     24
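To make the binary search figures concrete, here is a minimal sketch (not from the slides) of a three-way binary search that counts element comparisons; counting two tests per probe (equality, then direction) is what produces the factor of 2 in 2((log2 n) - 1):

    #include <iostream>
    #include <vector>

    // Binary search over a sorted vector; counts element comparisons.
    int binarySearch(const std::vector<int>& a, int key, long& comparisons)
    {
        int lo = 0, hi = (int) a.size() - 1;
        while (lo <= hi)
        {
            int mid = lo + (hi - lo) / 2;
            comparisons++;                    // equality test
            if (a[mid] == key) return mid;
            comparisons++;                    // direction test
            if (a[mid] < key) lo = mid + 1;
            else              hi = mid - 1;
        }
        return -1;   // key not present
    }

    int main()
    {
        std::vector<int> a(1000);
        for (int i = 0; i < 1000; i++) a[i] = i;    // sorted data
        long comparisons = 0;
        binarySearch(a, 700, comparisons);
        std::cout << comparisons << " comparisons\n";   // close to 2((log2 1000) - 1)
    }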
