
Page 1: Algo Analysis and Time Complexity

Algorithm Analysis and Complexity

Page 2: Algo Analysis and Time Complexity

Agenda

• Algorithm Analysis

• What is complexity?

• Types of complexities

• Methods of measuring complexity

Page 3: Algo Analysis and Time Complexity

Algorithm Analysis (1/5)

• Measures the efficiency of an algorithm or its implementation as a program as the input size becomes very large

• We evaluate a new algorithm by comparing its performance with that of previous approaches

– Comparisons are asymptotic analyses of classes of algorithms

• We usually analyze the time required for an algorithm and the space required for a data structure

Page 4: Algo Analysis and Time Complexity

Algorithm Analysis (2/5)

• Many criteria affect the running time of an algorithm, including

– speed of CPU, bus and peripheral hardware

– design think time, programming time and debugging time

– language used and coding efficiency of the programmer

– quality of input (good, bad or average)

Page 5: Algo Analysis and Time Complexity

Algorithm Analysis (3/5)

• The basis for comparing programs derived from two algorithms for solving the same problem should be

– Machine independent

– Language independent

– Environment independent (load on the system,...)

– Amenable to mathematical study

– Realistic

Page 6: Algo Analysis and Time Complexity

Algorithm Analysis (4/5)

• In lieu of some standard benchmark conditions under which two programs can be run, we estimate the algorithm's performance based on the number of key and basic operations it requires to process an input of a given size

• For a given input size n, we express the time T to run the algorithm as a function T(n)

• The concept of growth rate allows us to compare the running times of two algorithms without writing two programs and running them on the same computer

Page 7: Algo Analysis and Time Complexity

Algorithm Analysis (5/5)

• Analysis is performed with respect to a computational model

• We will usually use a generic uni-processor random-access machine (RAM)

– All memory equally expensive to access

– No concurrent operations

– All reasonable instructions take unit time
• Except, of course, function calls

– Constant word size
• Unless we are explicitly manipulating bits

Page 8: Algo Analysis and Time Complexity

Complexity

• The complexity of an algorithm is simply the amount of work the algorithm performs to complete its task.

Page 9: Algo Analysis and Time Complexity

Complexity

• A measure of the performance of an algorithm

• An algorithm’s performance depends on

– internal factors

– external factors

Page 10: Algo Analysis and Time Complexity

External Factors

• Speed of the computer on which it is run

• Quality of the compiler

• Size of the input to the algorithm

Page 11: Algo Analysis and Time Complexity

Internal Factors

The algorithm’s efficiency, in terms of:
• Time required to run
• Space (memory storage) required to run

Note:

Complexity measures the internal factors (usually more interested in time than space)

Page 12: Algo Analysis and Time Complexity

Two ways of finding complexity

• Experimental study

• Theoretical Analysis

Page 13: Algo Analysis and Time Complexity

Experimental study

• Write a program implementing the algorithm

• Run the program with inputs of varying size and composition

• Get an accurate measure of the actual running time

• Plot the results
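As an illustration of this workflow (a sketch of ours, not from the slides; the workload function, the sizes, and the clock choice are arbitrary), a minimal C++ timing harness could look like this:

#include <chrono>
#include <iostream>
#include <numeric>
#include <vector>

// Placeholder workload: sum the elements of a vector of size n.
long long work(const std::vector<int>& a) {
    return std::accumulate(a.begin(), a.end(), 0LL);
}

int main() {
    // Run the workload on inputs of varying size and measure the running time.
    for (int n = 1000; n <= 1000000; n *= 10) {
        std::vector<int> input(n, 1);
        auto start = std::chrono::steady_clock::now();
        volatile long long result = work(input);   // volatile keeps the call from being optimized away
        auto stop = std::chrono::steady_clock::now();
        auto us = std::chrono::duration_cast<std::chrono::microseconds>(stop - start).count();
        std::cout << "n = " << n << "  time = " << us << " us\n";
        (void)result;
    }
    // The printed (n, time) pairs can then be plotted.
}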

Page 14: Algo Analysis and Time Complexity

Limitations of Experiments

• It is necessary to implement the algorithm, which may be difficult

• Results may not be indicative of the running time on other inputs not included in the experiment.

• In order to compare two algorithms, the same hardware and software environments must be used

• Experimental data, though important, are not sufficient

Page 15: Algo Analysis and Time Complexity

Theoretical Analysis

• Uses a high-level description of the algorithm instead of an implementation

• Characterizes running time as a function of the input size, n.

• Takes into account all possible inputs

• Allows us to evaluate the speed of an algorithm independent of the hardware/software environment

Page 16: Algo Analysis and Time Complexity

Measure Running Time

• Basically, counting the number of basic operations required to solve a problem, depending on the problem/instance size

• What are basic operations?

• How to measure instance size?

Page 17: Algo Analysis and Time Complexity

Basic Operations

• What can be done in a constant time,regardless of the size of the input

• Basic arithmetic operations, e.g. add, subtract, shift, …

• Relational operations, e.g. <, >, <=, …
• Logical operations, e.g. and, or, not
• More complex arithmetic operations, e.g. multiply, divide, log, …
• Etc.

Page 18: Algo Analysis and Time Complexity

Non-basic Operations

• When are arithmetic, relational and logical operations not basic operations?

– Operations depend on the input size

– Arithmetic operations on very large numbers

– Logical operations on very long bit strings

– Etc.

Page 19: Algo Analysis and Time Complexity

Instance Size

• Number of elements
– Arrays, matrices
– Graphs (nodes or vertices)
– Sets

• Number of bits/bytes
– Numbers
– Bit strings

• Maximum values
– Numbers

Page 20: Algo Analysis and Time Complexity

Components of Program Space

• Program space = Instruction space + data space + stack space

• The instruction space is dependent on several factors:
– the compiler that generates the machine code
– the target computer

Page 21: Algo Analysis and Time Complexity

Components of Program Space

• Data space

– very much dependent on the computer architecture and compiler

– The magnitude of the data that a program works with is another factor

char  1        float        4
short 2        double       8
int   2        long double 10
long  4        pointer      2

Unit: bytes

Page 22: Algo Analysis and Time Complexity

Components of Program Space

• Data space

– Choosing a “smaller” data type has an effect on the overall space usage of the program.

– Choosing the correct type is especially important when working with arrays.

• Environment Stack Space
– Every time a function is called, the following data are saved on the stack:
1. the return address
2. the values of all local variables and value formal parameters
3. the bindings of all reference and const reference parameters

Page 23: Algo Analysis and Time Complexity

Space Complexity

• The space needed by an algorithm is the sum of a fixed part and a variable part

• The fixed part includes space for

– Instructions

– Simple variables

– Fixed size component variables

– Space for constants

– Etc..

Page 24: Algo Analysis and Time Complexity

Cont…

• The variable part includes space for

– Component variables whose size is dependent on the particular problem instance being solved

– Recursion stack space

– Etc..

Page 25: Algo Analysis and Time Complexity

Time Complexity

• The time complexity of a problem is

– the number of steps that it takes to solve an instance of the problem as a function of the size of the input (usually measured in bits), using the most efficient algorithm.

• The exact number of steps will depend on exactly what machine or language is being used.

• To avoid that problem, asymptotic notation is generally used.

Page 26: Algo Analysis and Time Complexity

Time Complexity

• How do we measure?
1. Count a particular operation (operation counts)
2. Count the number of steps (step counts)
3. Asymptotic complexity

Page 27: Algo Analysis and Time Complexity

Running Example: Insertion Sort

for (int i = 1; i < n; i++)   // n is the number of elements in the array
{
    // insert a[i] into a[0:i-1]
    int t = a[i];
    int j;
    for (j = i - 1; j >= 0 && t < a[j]; j--)
        a[j + 1] = a[j];
    a[j + 1] = t;
}

Page 28: Algo Analysis and Time Complexity

Operation Count

for (int i = 1; i < n; i++)
    for (j = i - 1; j >= 0 && t < a[j]; j--)
        a[j + 1] = a[j];

• How many comparisons are made?

The number of compares depends on the values in a[] and on t, as well as on n; the sketch below makes this concrete.
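Here is a small instrumented version (our own sketch; the function and variable names are ours) that counts how often t < a[j] is evaluated:

#include <iostream>
#include <vector>

// Insertion sort instrumented to count evaluations of "t < a[j]".
long long insertionSortCountingCompares(std::vector<int>& a) {
    long long compareCount = 0;
    int n = static_cast<int>(a.size());
    for (int i = 1; i < n; i++) {
        int t = a[i];
        int j;
        for (j = i - 1; j >= 0; j--) {
            compareCount++;                 // one evaluation of t < a[j]
            if (!(t < a[j])) break;         // the loop condition fails here
            a[j + 1] = a[j];
        }
        a[j + 1] = t;
    }
    return compareCount;
}

int main() {
    std::vector<int> sorted   = {1, 2, 3, 4, 5};   // cheapest input of this size
    std::vector<int> reversed = {5, 4, 3, 2, 1};   // most expensive input of this size
    std::cout << "already sorted:  " << insertionSortCountingCompares(sorted)   << " compares\n";
    std::cout << "reverse sorted:  " << insertionSortCountingCompares(reversed) << " compares\n";
}

Running it shows the same n producing different counts, which is exactly why the next slide distinguishes worst-, best- and average-case counts.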

Page 29: Algo Analysis and Time Complexity

Operation Count

• Worst-case count = maximum count

• Best-case count = minimum count

• Average count

Page 30: Algo Analysis and Time Complexity

Worst Case Operation Count

for (j = i - 1; j >= 0 && t < a[j]; j--)
    a[j + 1] = a[j];

a = [1,2,3,4] and t = 0  →  4 compares

a = [1,2,3,4,…,i] and t = 0  →  i compares

Page 31: Algo Analysis and Time Complexity

Worst Case Operation Count

for (int i = 1; i < n; i++)
    for (j = i - 1; j >= 0 && t < a[j]; j--)
        a[j + 1] = a[j];

total compares = 1+2+3+…+(n-1)

= (n-1)n/2

Page 32: Algo Analysis and Time Complexity

Step Count

• The operation-count method omits accounting for the time spent on all but the chosen operation

• The step-count method accounts for all the time spent in all parts of the program

• A program step is loosely defined to be a syntactically or semantically meaningful segment of a program for which the execution time is independent of the instance characteristics.

– However, n adds cannot be counted as one step.

Page 33: Algo Analysis and Time Complexity

Step Count
                                               steps/execution (s/e)
for (int i = 1; i < n; i++)                             1
{                                                       0
   // insert a[i] into a[0:i-1]                         0
   int t = a[i];                                        1
   int j;                                               0
   for (j = i - 1; j >= 0 && t < a[j]; j--)             1
      a[j + 1] = a[j];                                  1
   a[j + 1] = t;                                        1
}                                                       0

Page 34: Algo Analysis and Time Complexity

Step Count
                                               s/e     frequency
for (int i = 1; i < n; i++)                     1       n-1
{                                               0       0
   // insert a[i] into a[0:i-1]                 0       0
   int t = a[i];                                1       n-1
   int j;                                       0       0
   for (j = i - 1; j >= 0 && t < a[j]; j--)     1       (n-1)n/2
      a[j + 1] = a[j];                          1       (n-1)n/2
   a[j + 1] = t;                                1       n-1
}                                               0       n-1

Page 35: Algo Analysis and Time Complexity

Step Count

Total step counts

= (n-1) + 0 + 0 + (n-1) + 0 + (n-1)n/2 + (n-1)n/2 + (n-1) + (n-1)

= n² + 3n – 4

Page 36: Algo Analysis and Time Complexity

Asymptotic Complexity

• Two important reasons to determine operation and step counts
1. To compare the time complexities of two programs that compute the same function
2. To predict the growth in run time as the instance characteristic changes

• Neither of the two yields a very accurate measure
– Operation counts: focus on “key” operations and ignore all others
– Step counts: the notion of a step is itself inexact

• Asymptotic complexity provides meaningful statements about the time and space complexities of a program

Page 37: Algo Analysis and Time Complexity

Introduction

Why are asymptotic notations important?

• They give a simple characterization of an algorithm’s efficiency.

• They allow the comparison of the performances of various algorithms.

• For large inputs, the multiplicative constants and lower-order terms of an exact running time are dominated by the effects of the input size (the number of components).

Page 38: Algo Analysis and Time Complexity

Best, average, worst-case complexity

• In some cases, it is important to consider the best, worst and/or average (or typical) performance of an algorithm.

• For example, when sorting a list into order, if it is already in order then the algorithm may have very little work to do. (Best case)

• The worst-case analysis gives a bound for all possible inputs (and may be easier to calculate than the average case)

Page 39: Algo Analysis and Time Complexity

Analysis

• Worst case

–Provides an upper bound on running time

–An absolute guarantee

• Average case

–Provides the expected running time

–Very useful, but treat with care: what is “average”?
• Random (equally likely) inputs
• Real-life inputs

Page 40: Algo Analysis and Time Complexity

Asymptotic Notation – Overview

• A way to describe the behavior of functions in the limit.

• Describe growth of functions.

• Focus on what’s important by abstracting away low-order terms and constant factors.

• Indicate running times of algorithms.

• A way to compare “sizes” of functions.

• Examples:

– n steps vs. n+5 steps

– n steps vs. n² steps

• Running time of an algorithm as a function of input size n for large n.

• Expressed using only the highest-order term in the expression for the exact running time.

Page 41: Algo Analysis and Time Complexity

Asymptotic Notations

• Ο-notation ⟶ Ο ≈ ≤ (big-oh, upper bound)

• Ω-notation ⟶ Ω ≈ ≥ (big-omega, lower bound)

• Θ-notation ⟶ Θ ≈ = (theta, tight bound)

• ο-notation (little-oh, upper bound, not tight)

• ω-notation (little-omega, lower bound, not tight)

Page 42: Algo Analysis and Time Complexity

Θ-notation (Average Case)

For function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:

Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0 such that ∀ n ≥ n0 we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }

g(n) is an asymptotically tight bound for f(n).

Intuitively: the set of all functions that have the same rate of growth as g(n).

Page 43: Algo Analysis and Time Complexity

Θ-notation (Average Case)

For function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:

Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0 such that ∀ n ≥ n0 we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }

Technically, f(n) ∈ Θ(g(n)).

f(n) and g(n) are nonnegative for large n.

Page 44: Algo Analysis and Time Complexity

Example

• T(n) = f(n) = Θ(g(n))

– if c1·g(n) <= f(n) <= c2·g(n) for all n > n0, where c1, c2 and n0 are constants > 0

[Graph: f(n) lies between c1·g(n) and c2·g(n) for all n ≥ n0]

(1/2)n² − 3n = Θ(n²) with c1 = 1/14, c2 = 1/2 and n0 = 7
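Those constants can be verified directly (a worked check of ours, not on the original slide): the right-hand inequality (1/2)n² − 3n ≤ (1/2)n² holds for every n ≥ 0 because 3n ≥ 0, and the left-hand inequality (1/14)n² ≤ (1/2)n² − 3n is equivalent to 3n ≤ (1/2 − 1/14)n² = (3/7)n², i.e. n ≥ 7. Both bounds therefore hold simultaneously for all n ≥ n0 = 7.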

Page 45: Algo Analysis and Time Complexity

O-notation (Worst Case)

For function g(n), we define O(g(n)), big-O of g(n), as the set:

O(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀ n ≥ n0 we have 0 ≤ f(n) ≤ c·g(n) }

g(n) is an asymptotic upper bound for f(n).

Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n).

f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)).   Θ(g(n)) ⊂ O(g(n)).

Example: 2n² = O(n³) with c = 1 and n0 = 2

Page 46: Algo Analysis and Time Complexity

Ω-notation (Best Case)

For function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:

Ω(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀ n ≥ n0 we have 0 ≤ c·g(n) ≤ f(n) }

g(n) is an asymptotic lower bound for f(n).

Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).

f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)).   Θ(g(n)) ⊂ Ω(g(n)).

Page 47: Algo Analysis and Time Complexity

Ω Notation: Asymptotic Lower Bound

• T(n) = f(n) = Ω(g(n))
– if f(n) >= c·g(n) for all n > n0, where c and n0 are constants > 0

[Graph: f(n) lies above c·g(n) for all n ≥ n0]

– Example: T(n) = 2n + 5 is Ω(n). Why?
– 2n + 5 >= 2n, for all n > 0

– T(n) = 5n² − 3n is Ω(n²). Why?
– 5n² − 3n >= 4n², for all n >= 4

– √n = Ω(log n) with c = 1 and n0 = 16

Page 48: Algo Analysis and Time Complexity

Ω(g(n)): functions that grow at least as fast as g(n)   (≥)

Θ(g(n)): functions that grow at the same rate as g(n)   (=)

O(g(n)): functions that grow no faster than g(n)   (≤)

Page 49: Algo Analysis and Time Complexity

Relations Between Θ, O, Ω

Page 50: Algo Analysis and Time Complexity

L’Hôpital’s rule and Stirling’s formula

L’Hôpital’s rule: If lim n→∞ f(n) = lim n→∞ g(n) = ∞ and the derivatives f′, g′ exist, then

lim n→∞ f(n)/g(n) = lim n→∞ f′(n)/g′(n)

Example: log n vs. n

Stirling’s formula: n! ≈ (2πn)^(1/2) (n/e)^n

Example: 2^n vs. n!
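Worked out (our own calculation, applying the rules quoted above): for the first example, lim n→∞ (log₂ n)/n = lim n→∞ (1/(n ln 2))/1 = 0, so log n grows strictly more slowly than n, i.e. log n ∈ o(n). For the second example, Stirling’s formula gives n!/2^n ≈ (2πn)^(1/2) · (n/(2e))^n → ∞, so n! grows faster than 2^n.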

Page 51: Algo Analysis and Time Complexity

Orders of growth of some important functions

• All logarithmic functions log_a n belong to the same class Θ(log n) no matter what the logarithm’s base a > 1 is, because log_a n = log_b n / log_b a

• All polynomials of the same degree k belong to the same class: a_k·n^k + a_(k-1)·n^(k-1) + … + a_0 ∈ Θ(n^k)

• Exponential functions a^n have different orders of growth for different a’s

• order log n < order n^α (α > 0) < order a^n < order n! < order n^n
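To make these orders tangible, the short C++ sketch below (ours; the set of functions and the range of n are arbitrary) tabulates a few of them side by side:

#include <cmath>
#include <cstdio>

int main() {
    // Tabulate log2(n), n, n*log2(n), n^2 and 2^n for a few values of n.
    std::printf("%6s %10s %8s %12s %12s %22s\n",
                "n", "log2 n", "n", "n log2 n", "n^2", "2^n");
    for (int n = 2; n <= 64; n *= 2) {
        std::printf("%6d %10.2f %8d %12.2f %12.0f %22.0f\n",
                    n, std::log2(n), n, n * std::log2(n),
                    std::pow(n, 2.0), std::pow(2.0, n));
    }
}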

Page 52: Algo Analysis and Time Complexity

o-notation

For a given function g(n), the set little-o:

o(g(n)) = { f(n) : ∀ c > 0, ∃ n0 > 0 such that ∀ n ≥ n0 we have 0 ≤ f(n) < c·g(n) }

f(n) becomes insignificant relative to g(n) as n approaches infinity:

lim n→∞ [f(n) / g(n)] = 0

g(n) is an upper bound for f(n) that is not asymptotically tight.

Observe the difference in this definition from previous ones. Why?

Page 53: Algo Analysis and Time Complexity

ω-notation

For a given function g(n), the set little-omega:

ω(g(n)) = { f(n) : ∀ c > 0, ∃ n0 > 0 such that ∀ n ≥ n0 we have 0 ≤ c·g(n) < f(n) }

f(n) becomes arbitrarily large relative to g(n) as n approaches infinity:

lim n→∞ [f(n) / g(n)] = ∞

g(n) is a lower bound for f(n) that is not asymptotically tight.

Page 54: Algo Analysis and Time Complexity

Visualization of Asymptotic Growth

[Figure: relative growth of o(f(n)), O(f(n)), Θ(f(n)), Ω(f(n)) and ω(f(n)) compared with f(n), for n ≥ n0]

Page 55: Algo Analysis and Time Complexity

Need for Asymptotic Notation

• For example, suppose the exact run-time T(n) of an algorithm on an input of size n is T(n) = 5n² + 6n + 25 seconds. Then, since n ≥ 0, we have 5n² ≤ T(n) ≤ 6n² for all n ≥ 9.

• Thus we can say that T(n) is roughly proportional to n² for sufficiently large values of n.

• We write this as T(n) ∈ Θ(n²), or say that “T(n) is in the exact order of n²”.

Page 56: Algo Analysis and Time Complexity

Contd..

• Generally, an algorithm with a run-time of Θ(n log n) will perform better than an algorithm with a run-time of Θ(n²), provided that n is sufficiently large.

• However, for small values of n, the Θ(n²) algorithm may run faster due to having smaller constant factors.

• The value of n at which the Θ(n log n) algorithm first outperforms the Θ(n²) algorithm is called the break-even point for the Θ(n log n) algorithm.

Page 57: Algo Analysis and Time Complexity

Examples

• Determine whether the following is true or false.

• 100n + 5 ∈ O(n²)

Solution:

• 100n + 5 ≤ 100n + n for all n ≥ 5

• 101n ≤ 101n²

• where c = 101 and n0 = 5.

Page 58: Algo Analysis and Time Complexity

• Determine whether the following is true or false.

• n² + 10n ∈ O(n²)

Solution:

• Take c = 2, n0 = 10, because

• n² + 10n ≤ 2·g(n) for all n ≥ n0

• n² + 10n ≤ 2n² for all n ≥ 10

Page 59: Algo Analysis and Time Complexity

• Show that 5n² ∈ O(n²)
Solution:

• For n ≥ 0, we can take c = 5, n0 = 0.

• 5n² ≤ 5n² for n ≥ 0

• Show that n(n-1)/2 ∈ O(n²)
Solution:

• For n ≥ 0, n(n-1)/2 ≤ n·n/2 = n²/2, so c = ½ and n0 = 0.

Page 60: Algo Analysis and Time Complexity

• Show that n² ∈ O(n² + 10n)

Solution:

• For n ≥ 0

• n² ≤ 1 · (n² + 10n)

c = 1 and n0 = 0.

Page 61: Algo Analysis and Time Complexity

• Find the order of the function 3n + 2

Solution:

• t(n) = 3n + 2

• t(n) ≤ c·g(n)

• 3n + 2 ≤ c·n

• 3n + 2 ≤ 4n where c = 4

• If n = 1: 3·1 + 2 ≤ 4, i.e. 5 ≤ 4 (false)

• If n = 2: 3·2 + 2 ≤ 8, i.e. 8 ≤ 8 (true)

• If n = 3: 3·3 + 2 ≤ 12, i.e. 11 ≤ 12 (true)

• t(n) ∈ O(n) where c = 4 and n0 = 2, which is called the breakeven point.

Page 62: Algo Analysis and Time Complexity

• Find the order of the function

• t(n) = 10n² + 4n + 2

Solution:

• t(n) ≤ c·g(n) for all n ≥ n0

• 10n² + 4n + 2 ≤ c·n² where g(n) = n²

• 10n² + 4n + 2 ≤ 11n² where c = 11

• If n = 1, 16 ≤ 11 (false)

• If n = 2, 50 ≤ 44 (false)

• If n = 3, 104 ≤ 99 (false)

• If n = 4, 178 ≤ 176 (false)

• If n = 5, 272 ≤ 275 (true)

• If n = 6, 386 ≤ 396 (true)

• The given function 10n² + 4n + 2 ∈ O(n²) when c = 11 and n0 = 5, which is the breakeven point.

Page 63: Algo Analysis and Time Complexity

Ω(n²)

• 4n²

• 6n² + 9

• 5n² + 2n

• 4n³ + 3n²

• 6n⁶ + 3n⁴

Θ(n²)

• 4n²

• 6n² + 9

• 5n² + 2n

Page 64: Algo Analysis and Time Complexity

O(n²)

• 3 log n + 8

• 5n + 7

• 2 log n

• 4n²

• 6n² + 9

• 5n² + 2n

Page 65: Algo Analysis and Time Complexity

Typical Running Time Functions

• 1 (constant running time):

– Instructions are executed once or a few times

• log N (logarithmic)

– A big problem is solved by cutting the original problem into smaller sizes, by a constant fraction at each step

• N (linear)

– A small amount of processing is done on each input element

• N log N

– A problem is solved by dividing it into smaller problems, solving them independently and combining the solutions

Page 66: Algo Analysis and Time Complexity

Exercise

Page 67: Algo Analysis and Time Complexity

Solution

Page 68: Algo Analysis and Time Complexity

Performance Classification

f(n)      Classification

1         Constant: run time is fixed, and does not depend upon n. Most instructions are executed once, or only a few times, regardless of the amount of information being processed.

log n     Logarithmic: when n increases, so does run time, but much slower. Common in programs which solve large problems by transforming them into smaller problems.

n         Linear: run time varies directly with n. Typically, a small amount of processing is done on each element.

n log n   When n doubles, run time slightly more than doubles. Common in programs which break a problem down into smaller sub-problems, solve them independently, then combine solutions.

n²        Quadratic: when n doubles, run time increases fourfold. Practical only for small problems; typically the program processes all pairs of input (e.g. in a double nested loop).

n³        Cubic: when n doubles, run time increases eightfold.

2^n       Exponential: when n doubles, run time squares. This is often the result of a natural, “brute force” solution.

Page 69: Algo Analysis and Time Complexity

Common growth rates

Time complexity            Example

O(1)        constant       Adding to the front of a linked list

O(log N)    log            Finding an entry in a sorted array

O(N)        linear         Finding an entry in an unsorted array

O(N log N)  n-log-n        Sorting n items by ‘divide-and-conquer’

O(N²)       quadratic      Shortest path between two nodes in a graph

O(N³)       cubic          Simultaneous linear equations

O(2^N)      exponential    The Towers of Hanoi problem

Page 70: Algo Analysis and Time Complexity

Growth rates

[Graph of running time against number of inputs for O(N²) and O(N log N): for a short time N² is better than N log N]

Page 71: Algo Analysis and Time Complexity

Function of Growth rate

Page 72: Algo Analysis and Time Complexity

Standard Analysis Techniques

• Constant time statements

• Analyzing Loops

• Analyzing Nested Loops

• Analyzing Sequence of Statements

• Analyzing Conditional Statements

Page 73: Algo Analysis and Time Complexity

Constant time statements

• Simplest case: O(1) time statements

• Assignment statements of simple data types
      int x = y;

• Arithmetic operations:
      x = 5 * y + 4 - z;

• Array referencing:
      A[j] = 5;

• Array assignment:
      ∀ j, A[j] = 5;

• Most conditional tests:
      if (x < 12) ...

Page 74: Algo Analysis and Time Complexity

Analyzing Loops[1]

• Any loop has two parts:
– How many iterations are performed?
– How many steps per iteration?

int sum = 0, j;
for (j = 0; j < N; j++)
    sum = sum + j;

– Loop executes N times (0..N-1)

– O(1) steps per iteration

• Total time is N * O(1) = O(N*1) = O(N)

Page 75: Algo Analysis and Time Complexity

Analyzing Loops[2]

• What about this for loop?

int sum = 0, j;
for (j = 0; j < 100; j++)
    sum = sum + j;

• Loop executes 100 times
• O(1) steps per iteration
• Total time is 100 * O(1) = O(100 * 1) = O(100)

Page 76: Algo Analysis and Time Complexity

Analyzing Nested Loops[1]

• Treat just like a single loop and evaluate each level of nesting as needed:

int j, k;
for (j = 0; j < N; j++)
    for (k = N; k > 0; k--)
        sum += k + j;

• Start with outer loop:
– How many iterations? N
– How much time per iteration? Need to evaluate inner loop

• Inner loop uses O(N) time
• Total time is N * O(N) = O(N*N) = O(N²)

Page 77: Algo Analysis and Time Complexity

Analyzing Nested Loops[2]

• What if the number of iterations of one loop depends on the counter of the other?

int j, k;
for (j = 0; j < N; j++)
    for (k = 0; k < j; k++)
        sum += k + j;

• Analyze inner and outer loop together:
• Number of iterations of the inner loop is:
• 0 + 1 + 2 + ... + (N-1) = N(N-1)/2 = O(N²)   (checked empirically in the sketch below)
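The closed form can be confirmed with a sketch like the one below (ours; N is an arbitrary test size):

#include <iostream>

int main() {
    const int N = 100;                  // arbitrary problem size for the check
    long long count = 0;
    int sum = 0;
    for (int j = 0; j < N; j++)
        for (int k = 0; k < j; k++) {
            sum += k + j;
            count++;                    // one inner-loop iteration
        }
    std::cout << "measured iterations: " << count << "\n";
    std::cout << "N(N-1)/2:            " << static_cast<long long>(N) * (N - 1) / 2 << "\n";
}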

Page 78: Algo Analysis and Time Complexity

Analyzing Sequence of Statements

• For a sequence of statements, compute their complexity functions individually and add them up

for (j = 0; j < N; j++)                 // O(N²)
    for (k = 0; k < j; k++)
        sum = sum + j*k;
for (l = 0; l < N; l++)                 // O(N)
    sum = sum - l;
cout << "Sum=" << sum;                  // O(1)

Total cost is O(N²) + O(N) + O(1) = O(N²)   SUM RULE

Page 79: Algo Analysis and Time Complexity

Analyzing Conditional Statements

What about conditional statements such as

if (condition)
    statement1;
else
    statement2;

where statement1 runs in O(N) time and statement2 runs in O(N²) time?

We use "worst case" complexity: among all inputs of size N, what is the maximum running time?

The analysis for the example above is O(N²); a concrete sketch follows.
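A concrete instance of the pattern (a hypothetical example of ours, not from the slides), with a linear branch and a quadratic branch:

#include <vector>

// The flag decides which branch runs; the worst case over all inputs is O(N^2).
long long process(const std::vector<int>& a, bool quickPath) {
    long long sum = 0;
    const int N = static_cast<int>(a.size());
    if (quickPath) {
        for (int i = 0; i < N; i++)         // statement1: O(N)
            sum += a[i];
    } else {
        for (int i = 0; i < N; i++)         // statement2: O(N^2)
            for (int j = 0; j < N; j++)
                sum += static_cast<long long>(a[i]) * a[j];
    }
    return sum;
}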

Page 80: Algo Analysis and Time Complexity

Best Case

• Best case is defined by which input of size n is cheapest among all inputs of size n.

• Misunderstanding: “The best case for my algorithm is n=1 because that is the fastest.” WRONG!

Page 81: Algo Analysis and Time Complexity

Some Properties of Big “O”

• Transitive property
– If f is O(g) and g is O(h) then f is O(h)

• Product of upper bounds is upper bound for the product
– If f is O(g) and h is O(r) then fh is O(gr)

• Exponential functions grow faster than polynomials
– n^k is O(b^n)  ∀ b > 1 and k ≥ 0
   e.g. n²⁰ is O(1.05^n)

• Logarithms grow more slowly than powers
– log_b n is O(n^k)  ∀ b > 1 and k > 0
   e.g. log₂ n is O(n^0.5)
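The transitive property has a one-line justification (our own, standard argument): if f(n) ≤ c1·g(n) for all n ≥ n1 and g(n) ≤ c2·h(n) for all n ≥ n2, then f(n) ≤ c1·c2·h(n) for all n ≥ max(n1, n2), so f is O(h) with witness constants c1·c2 and max(n1, n2). The product rule follows by multiplying the two inequalities in the same way, since all the functions are nonnegative.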

Page 82: Algo Analysis and Time Complexity

Comparison of two algorithms

Consider two algorithms, A and B, for solving a given problem.

T_A(n) and T_B(n) are the time complexities of A and B respectively (where n is a measure of the problem size).

One possibility arises if we know the problem size a priori.

For example, suppose the problem size is n0 and T_A(n0) < T_B(n0). Then clearly algorithm A is better than algorithm B for problem size n0.

In the general case, we have no a priori knowledge of the problem size.

Page 83: Algo Analysis and Time Complexity

Cont..

• Limitations:
– we don't know the problem size beforehand
– it is not true that one of the functions is less than or equal to the other over the entire range of problem sizes.

• So we consider the asymptotic behavior of the two functions for very large problem sizes.

Page 84: Algo Analysis and Time Complexity

Time Complexity Vs Space Complexity

• Achieving both low time complexity and low space complexity is difficult and is the best case

• There is always a trade off

• If the memory available is large
– Need not compromise on time complexity

• If speed of execution is not the main concern and the memory available is small
– Can't compromise on space complexity

Page 85: Algo Analysis and Time Complexity

Example

• Size of data = 10 MB

• Check if a word is present in the data or not

• Two ways
– Better Space Complexity

– Better Time Complexity

Page 86: Algo Analysis and Time Complexity

Contd..

– Load the entire data into main memory and check one by one

• Faster process but takes a lot of space

– Load data word-by-word into main memory and check

• Slower process but takes less space
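A hedged sketch of the two approaches (ours; the file path and search word are placeholders): the first loads the whole file into memory before scanning, the second streams it word by word.

#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

// Approach 1: load the entire file into memory, then scan it.
// Faster access pattern, but memory use grows with the file size.
bool containsWordLoadAll(const std::string& path, const std::string& word) {
    std::ifstream in(path);
    std::stringstream buffer;
    buffer << in.rdbuf();                  // the whole file now lives in memory
    std::string token;
    while (buffer >> token)
        if (token == word) return true;
    return false;
}

// Approach 2: stream the file word by word.
// Only a constant amount of memory, at the cost of a slower process.
bool containsWordStreaming(const std::string& path, const std::string& word) {
    std::ifstream in(path);
    std::string token;
    while (in >> token)
        if (token == word) return true;
    return false;
}

int main() {
    const std::string path = "data.txt";   // placeholder file name
    std::cout << containsWordLoadAll(path, "algorithm") << "\n";
    std::cout << containsWordStreaming(path, "algorithm") << "\n";
}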

Page 87: Algo Analysis and Time Complexity

Run these algorithms

For loop

a. Sum = 0;
   for (i = 0; i < N; i++)
       for (j = 0; j < i*i; j++)
           for (k = 0; k < j; k++)
               Sum++;

Compare the above "for loops" for different inputs
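One way to make the comparison (a harness of ours; the values of N are arbitrary) is to run the loop for several sizes and record both the final Sum and the elapsed time, then watch how both grow as N doubles:

#include <chrono>
#include <iostream>

int main() {
    for (long long N = 10; N <= 80; N *= 2) {      // arbitrary input sizes
        long long Sum = 0;
        auto start = std::chrono::steady_clock::now();
        for (long long i = 0; i < N; i++)
            for (long long j = 0; j < i * i; j++)
                for (long long k = 0; k < j; k++)
                    Sum++;
        auto stop = std::chrono::steady_clock::now();
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(stop - start).count();
        std::cout << "N = " << N << "  Sum = " << Sum << "  time = " << ms << " ms\n";
    }
}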

Page 88: Algo Analysis and Time Complexity

Example 3. Conditional Statements

Sum = 0;
for (i = 1; i < N; i++)
    for (j = 1; j < i*i; j++)
        if (j % i == 0)
            for (k = 0; k < j; k++)
                Sum++;

Analyze the complexity of the above algorithm for different inputs
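As a starting point (our own suggestion, not the slides' solution), the count of innermost executions can be measured and divided by candidate growth rates; the ratio that settles toward a constant indicates the order:

#include <cmath>
#include <iostream>

int main() {
    for (long long N = 10; N <= 80; N *= 2) {      // arbitrary input sizes
        long long Sum = 0;
        for (long long i = 1; i < N; i++)
            for (long long j = 1; j < i * i; j++)
                if (j % i == 0)
                    for (long long k = 0; k < j; k++)
                        Sum++;
        // Divide by candidate growth rates and see which ratio stabilizes.
        std::cout << "N = " << N
                  << "  Sum = " << Sum
                  << "  Sum/N^3 = " << Sum / std::pow(N, 3.0)
                  << "  Sum/N^4 = " << Sum / std::pow(N, 4.0) << "\n";
    }
}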

Page 89: Algo Analysis and Time Complexity

Summary

• Analysis of algorithms

• Complexity

• Even with a high-speed processor and large memory, an algorithm with poor asymptotic complexity is not efficient

• Trade Off between Time Complexity and Space Complexity

Page 90: Algo Analysis and Time Complexity

References

• Fundamentals of Computer Algorithms
  Ellis Horowitz, Sartaj Sahni, Sanguthevar Rajasekaran

• Algorithm Design
  Michael T. Goodrich, Roberto Tamassia

• Analysis of Algorithms
  Jeffrey J. McConnell

Page 91: Algo Analysis and Time Complexity

Thank You