
Comp3050

Logarithms and Exponents

A quick revision

We know that log_b a = c if a = b^c

We also know that if b = 2, we may omit writing the base of the logarithm

e.g. log 1024 = 10

Some rules for logarithms

If a, b and c are positive real numbers then:

a. log_b (a*c) = log_b a + log_b c

b. log_b (a/c) = log_b a - log_b c

c. log_b (a^c) = c log_b a

d. b^(log_c a) = a^(log_c b)

e. log_b a = log_c a / log_c b

f. log 2 = 1
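These identities are easy to spot-check numerically. A minimal Python sketch (the particular values of a, b and c below are arbitrary choices, not from the slides):

import math

a, b, c = 7.0, 2.0, 3.0           # arbitrary positive reals; b is also used as the log base

def log_b(x):
    return math.log(x, b)

# a. log_b(a*c) = log_b(a) + log_b(c)
assert math.isclose(log_b(a * c), log_b(a) + log_b(c))
# b. log_b(a/c) = log_b(a) - log_b(c)
assert math.isclose(log_b(a / c), log_b(a) - log_b(c))
# c. log_b(a^c) = c * log_b(a)
assert math.isclose(log_b(a ** c), c * log_b(a))
# d. b^(log_c a) = a^(log_c b)
assert math.isclose(b ** math.log(a, c), a ** math.log(b, c))
# e. change of base: log_b(a) = log_c(a) / log_c(b)
assert math.isclose(log_b(a), math.log(a, c) / math.log(b, c))
# f. with the base-2 convention, log 2 = 1
assert math.isclose(math.log2(2), 1.0)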

Exponent rules

We also know that (b^a)^c = b^(a*c)

b^a * b^c = b^(a+c)

b^a / b^c = b^(a-c)

Based on these rules:

2^(log n) = n^(log 2) = n^1 = n
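A quick numeric check of this identity in Python (n = 1000 is just an arbitrary value; log is base 2, per the convention above):

import math

n = 1000.0                        # arbitrary positive value

# 2^(log n) = n^(log 2) = n
assert math.isclose(2 ** math.log2(n), n)
assert math.isclose(n ** math.log2(2), n)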

More examples

4^n = (2^2)^n = 2^(2n) (exponent rule 1)

log 2^n = n (logarithm rule c)

What will log 4^n be?

More examples

Algorithm A uses 10 n log n operations; Algorithm B uses n^2 operations. Do you think Algorithm A will always be better than Algorithm B? At what value of n does the situation change (i.e. what is n0)?

What if Algorithm B uses n^n operations instead?

Explanation

We observe that up to n = 10, Algo A costs more; after n = 10, Algo B costs more.

So it is not always the case that 10 n log n is less than n^2.

n     10 n log n    n^2
1        0.0000       1
2        6.0206       4
3       14.3136       9
4       24.0824      16
5       34.9485      25
6       46.6891      36
7       59.1569      49
8       72.2472      64
9       85.8818      81
10     100.0000     100
11     114.5532     121
12     129.5017     144
13     144.8126     169
14     160.4579     196
15     176.4137     225
16     192.6592     256
17     209.1763     289
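The table can be reproduced, and the crossover point found, with a short Python sketch (note that the table's 10 n log n values use a base-10 logarithm):

import math

def cost_a(n):                    # Algorithm A: 10 n log n (log base 10, as in the table)
    return 10 * n * math.log10(n)

def cost_b(n):                    # Algorithm B: n^2
    return n * n

for n in range(1, 18):
    a, b = cost_a(n), cost_b(n)
    winner = "A cheaper" if a < b else ("B cheaper" if b < a else "equal")
    print(f"{n:2d}  {a:9.4f}  {b:4d}  {winner}")

# For 2 <= n < 10 Algorithm A costs more; the two costs meet at n = 10,
# and for n > 10 Algorithm B costs more, so n0 = 10.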

Three types of bounds

There are three bounds:

Big O, big Ω (Omega) and big Θ (Theta)

We saw two functions, 10 n log n and n^2.

We also saw that which one is larger depends on n.

So if we plot them:

Graphs of 10 n log n and n^2 plotted

Big O

We may also call Big O the WORST CASE (it gives an upper bound).

For this example:

f(n) = 10 n log n

Its asymptotic bound is O(n log n)

g(n) = n^2

Its asymptotic bound is O(n^2)

Big Ω (Omega)

We may call Big Ω the BEST CASE (it gives a lower bound).

Making big Omega

f(n) = 10 n log n, g(n) = n^2

So the function is

f(n) = Ω(n log n)

for c = 10 and n0 = 10

since 10 n log n >= c * n log n whenever n > n0 (10)
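A small Python check of this Ω witness (c = 10 and n0 = 10 are the slide's values; in fact the inequality already holds for every n >= 1):

import math

c, n0 = 10, 10                    # witnesses taken from the slide

def f(n):                         # f(n) = 10 n log n (base-2 log; any fixed base works)
    return 10 * n * math.log2(n)

def g(n):                         # g(n) = n log n
    return n * math.log2(n)

# f(n) >= c * g(n) for all n >= n0, hence f(n) is Omega(n log n).
# The tiny epsilon guards against floating-point rounding in the comparison.
assert all(f(n) >= c * g(n) - 1e-9 for n in range(n0, 1001))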

Big Θ (Theta)


Intuition for Asymptotic Notation

Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)

Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)

Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)

Little-oh: f(n) is o(g(n)) if f(n) is asymptotically strictly less than g(n)

Little-omega: f(n) is ω(g(n)) if f(n) is asymptotically strictly greater than g(n)
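The slides state these informally; the standard formal definitions behind them, written in LaTeX, are:

\begin{align*}
f(n) \in O(g(n)) &\iff \exists\, c > 0,\ n_0 \ge 1:\ f(n) \le c\,g(n) \text{ for all } n \ge n_0 \\
f(n) \in \Omega(g(n)) &\iff \exists\, c > 0,\ n_0 \ge 1:\ f(n) \ge c\,g(n) \text{ for all } n \ge n_0 \\
f(n) \in \Theta(g(n)) &\iff f(n) \in O(g(n)) \text{ and } f(n) \in \Omega(g(n)) \\
f(n) \in o(g(n)) &\iff \forall\, c > 0\ \exists\, n_0 \ge 1:\ f(n) \le c\,g(n) \text{ for all } n \ge n_0 \\
f(n) \in \omega(g(n)) &\iff \forall\, c > 0\ \exists\, n_0 \ge 1:\ f(n) \ge c\,g(n) \text{ for all } n \ge n_0
\end{align*}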

General rule

Little-o is called a strict upper bound.

f(n) = 12 n^2 + 6n is o(n^3)

And little-omega, ω, is a strict lower bound:

the same f(n) is ω(n)

Question

Bill has an algorithm, find2D, to find an element x in an n × n array A. The algorithm find2D iterates over the rows of A and calls the algorithm arrayFind on each row, until x is found or it has searched all rows of A.

What is the worst-case running time of find2D in terms of n?

What is the worst-case running time of find2D in terms of N, where N is the total size of A?

Would it be correct to say that find2D is a linear-time algorithm? Why or why not?

Solution
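The solution slide's content is not in this transcript. Below is a minimal Python sketch of the two algorithms as described in the question (only the names find2D and arrayFind come from the slide; the bodies are an assumed straightforward implementation), with the analysis in the comments:

def arrayFind(x, row):
    # Linear scan of one row: O(n) comparisons in the worst case.
    for value in row:
        if value == x:
            return True
    return False

def find2D(x, A):
    # Scans the rows of the n x n array A until x is found.
    # Worst case: n rows * O(n) per row = O(n^2) in terms of n.
    # In terms of the total input size N = n^2 this is O(N), so find2D
    # is linear in the size of its input, but it is NOT linear in n;
    # calling it "linear-time" is therefore ambiguous.
    for row in A:
        if arrayFind(x, row):
            return True
    return False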

More examples

There are two prefixAverages algorithms. Each one creates an array B in which element B[i] must be the average of the elements of array A up to the i-th location.

Algorithm 1
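The slide's pseudocode is not reproduced in this transcript; below is a sketch of the usual quadratic version in Python (the function name and details are assumptions):

def prefixAverages1(A):
    # B[i] = average of A[0..i], recomputing each prefix sum from scratch.
    n = len(A)
    B = [0.0] * n                 # creating B: O(n)
    for i in range(n):            # outer loop runs n times
        s = 0.0
        for j in range(i + 1):    # inner loop runs i+1 times -> 1 + 2 + ... + n = O(n^2)
            s += A[j]
        B[i] = s / (i + 1)
    return B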

Algorithm 2
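Likewise, a sketch of the linear version, which keeps a running sum instead of an inner loop (again an assumed implementation):

def prefixAverages2(A):
    # B[i] = average of A[0..i], maintained with a running sum.
    n = len(A)
    B = [0.0] * n                 # creating B: O(n)
    s = 0.0                       # O(1)
    for i in range(n):            # single loop: O(n)
        s += A[i]                 # running sum replaces the inner loop
        B[i] = s / (i + 1)
    return B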

Result

The running time of algorithm prefixAverages1 is O(n) for the first and second statements.

From the third statement onwards (the nested loops) it is O(n^2).

So the overall running time is O(n^2).

Result

The running time of algorithm prefixAverages2 is O(1) for the first and second statements.

From the third statement onwards it is O(n).

So the overall running time is O(n).

Example

Calculate the algorithm's running time

Loop1, Loop2, Loop3, Loop4, Loop5 – O(n^4)
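The listings for Loop1 through Loop5 are not in this transcript. As one hypothetical illustration of how an O(n^4) bound arises, a Python sketch with four nested loops over n:

def loop_example(n):
    # The innermost statement executes n * n * n * n = n^4 times,
    # so the running time of this fragment is O(n^4).
    count = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                for m in range(n):
                    count += 1
    return count                  # equals n ** 4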

Example

Solution
