
Page 1: lecture3 - University of Toronto236/sadia/3/lecture3.pdf · CSC108H • LECTURE 4 Algorithm Runtime: we describe it in terms of the number of steps as a function of input size Asymptotic

CSC236 Week 3 >> BIG O, BIG OMEGA, BIG THETA

Sadia ‘Rain’ Sharmin

Page 2:

CSC108H • LECTURE 4

Announcements

Problem set 1 is due next week!

Make sure you're attending tutorials for solutions to tutorial handouts.

Please check Piazza before emailing questions.

Page 3:

Things computer scientists say ...

Page 4:


Algorithm Runtime: we describe it in terms of the number of steps as a function of input size

Asymptotic notations are for describing the growth rate of functions:
- constant factors don't matter
- only the highest-order term matters

Asymptotic Notations

Page 5:

[Figure: growth rate ranking of typical functions, ordered from slow-growing to fast-growing]
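The ranking can be checked numerically. This is a hedged illustration, not from the slides: it evaluates some typical runtime functions at one large input size and confirms they come out in the standard slow-to-fast order.

```python
import math

# Typical runtime functions evaluated at n = 1000, listed from
# slowest-growing to fastest-growing (an illustration, not the slide's figure).
n = 1000
ranking = [
    ("log n", math.log2(n)),
    ("n", float(n)),
    ("n log n", n * math.log2(n)),
    ("n^2", float(n ** 2)),
    ("2^n", 2.0 ** n),
]

# Since the list is written slow-to-fast, the values at n = 1000
# should already be in increasing order.
values = [v for _, v in ranking]
assert values == sorted(values)
```

At a single point this is only suggestive, of course; asymptotic ranking is about behaviour as n grows without bound.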

Page 6:

ConcepTest


Page 7:

Big-Oh, Big-Omega, Big-Theta

O( f(n) ): the set of functions that grow no faster than f(n)
● asymptotic upper bound on growth rate

Ω( f(n) ): the set of functions that grow no slower than f(n)
● asymptotic lower bound on growth rate

Θ( f(n) ): the set of functions that grow no faster and no slower than f(n)
● asymptotic tight bound on growth rate

Page 8:

The formal mathematical definition of big-Oh

Video explanation: https://www.youtube.com/watch?v=6Ol2JbwoJp0&t=11s

Page 9:


Analogy

Chicken grows slower than turkey, or: chicken size is in O(turkey size).

What it really means:

● A baby chicken might be larger than a baby turkey at the beginning.

● But after a certain "breakpoint", the chicken's size will be surpassed by the turkey's size.

● From the breakpoint on, the chicken's size will always be smaller than the turkey's size.

Page 10:

Definition of f(n) is in O(g(n))

Let f(n) and g(n) be two functions of n. By “f(n)’s growth rate is upper-bounded by g(n)”, we mean the following:

● f(n) is bounded from above by g(n) in some way, something like f(n) ⩽ g(n)

● constant factors don’t matter, so we’re okay with f(n) being bounded by g(n) multiplied by some constant factor, i.e., f(n) ⩽ cg(n)

● we only care about large inputs, so we just need the above inequality to be true when n is larger than some “breakpoint” n₀

So the formal definition of “f(n) is in O(g(n))” is the following
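The slide's formula is an image and did not survive the transcript; reconstructed from the three bullets above, the standard formal definition is:

```latex
f(n) \in O(g(n)) \iff \exists c > 0,\ \exists n_0 > 0,\ \forall n \geq n_0,\quad f(n) \leq c \cdot g(n)
```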


Page 11:

Definition of big-Oh

[Figure: f(n) and cg(n) plotted against n, crossing at the breakpoint n₀]

Beyond the breakpoint n₀, f(n) is upper-bounded by cg(n), where c is some wisely chosen constant multiplier.

Page 12:

Knowing the definition, we can now write big-Oh proofs.

The key is finding n₀ and c.

Page 13:


Prove that 100n + 10000 is in O(n).

Need to find some n₀ and c such that this is true: 100n + 10000 ≤ c·n for all n ≥ n₀

These c and n₀ can be anything that makes the inequality work. There is no unique right answer; your proof just has to make it clear that the c and n₀ you chose will always work.

Example 1

Page 14:


Prove that 100n + 10000 is in O(n).

We need to find an upper bound, so we can feel free to overestimate and simplify as needed. As long as we are definitely getting larger, we are ok. The trick is to move the expression closer to the other side of the inequality.

Example 1

Note: see pg. 89 of the course notes for the solution.
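The official solution is in the course notes; a sketch along the lines the slide suggests, overestimating the constant term, is:

```latex
\text{Choose } c = 10100,\ n_0 = 1. \text{ Then for all } n \geq n_0: \\
100n + 10000 \leq 100n + 10000n \quad (\text{multiply the constant term by } n \geq 1) \\
= 10100n = c \cdot n, \\
\text{so } 100n + 10000 \in O(n).
```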
Page 15:


Big O Proof Tricks

Remember, you can choose a value of n as a starting point.

- Add in positive terms, e.g. n ≤ n + n, if n ≥ 0
- Multiply by positive terms, e.g. 5n + 100 ≤ 5n + 100n, if n ≥ 1
- Remove negative terms, e.g. 5n - 3 ≤ 5n, if n ≥ 0

Another trick that can sometimes come in handy is splitting up terms, like 2n = n + n or n² = n·n.
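A chosen (c, n₀) pair can be spot-checked numerically before writing the proof. This helper is a hypothetical illustration, not from the lecture: it only tests finitely many points, so it can refute a bad witness but never replaces the proof.

```python
# Hypothetical helper (not from the slides): numerically spot-check a
# claimed big-Oh witness pair (c, n0) for f(n) <= c * g(n).
def check_big_oh(f, g, c, n0, n_max=10_000):
    """Return True if f(n) <= c * g(n) holds for every n in [n0, n_max]."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# Example 1's witness: 100n + 10000 <= 10100 * n for all n >= 1.
assert check_big_oh(lambda n: 100 * n + 10_000, lambda n: n, c=10_100, n0=1)

# A bad witness fails: c = 1 cannot work, since 100n + 10000 > n.
assert not check_big_oh(lambda n: 100 * n + 10_000, lambda n: n, c=1, n0=1)
```

Passing the check only means no counterexample was found up to n_max; the actual argument still has to cover all n ≥ n₀.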

Page 16:


Prove that 5n² - 3n + 20 is in O(n²).

Example 2

(The Big O proof tricks from the previous slide apply here.)
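A proof sketch using the tricks above (not an official solution; any valid c and n₀ work):

```latex
\text{Choose } c = 25,\ n_0 = 1. \text{ Then for all } n \geq n_0: \\
5n^2 - 3n + 20 \leq 5n^2 + 20 \quad (\text{remove the negative term}) \\
\leq 5n^2 + 20n^2 \quad (\text{multiply the constant term by } n^2 \geq 1) \\
= 25n^2 = c \cdot n^2, \\
\text{so } 5n^2 - 3n + 20 \in O(n^2).
```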
Page 17:


Definition of big-Omega

The only difference from the big-Oh definition is the direction of the inequality.
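The slide's formula isn't reproduced in the transcript; the standard definition, with the inequality flipped relative to big-Oh, is:

```latex
f(n) \in \Omega(g(n)) \iff \exists c > 0,\ \exists n_0 > 0,\ \forall n \geq n_0,\quad f(n) \geq c \cdot g(n)
```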

Page 18:


Prove that 2n³ - 7n + 1 is in Ω(n³).

Now we want to find a lower bound: something that is always smaller than what we are given, from some breakpoint onward.

i.e. we must find c and n₀ such that 2n³ - 7n + 1 ≥ c·n³ for all n ≥ n₀

Example 3

Page 19:


We need to find a lower bound, so we can feel free to underestimate and simplify as needed. As long as we are definitely getting smaller, we are ok.

- Multiply negative terms by larger factors, e.g. 5n² - n ≥ 5n² - n·n, if n ≥ 1
- Remove positive terms, e.g. 5n² + 2n ≥ 5n², if n ≥ 1

Another trick that can sometimes come in handy is splitting up terms, like 2n = n + n or n² = n·n.

Big Omega Proof Tricks

Page 20:

Write up the proof

Proof: choose n₀ = 3, c = 1. Then for all n ≥ n₀,

2n³ - 7n + 1 = n³ + (n³ - 7n) + 1
            ≥ n³ + 1        # because n³ - 7n ≥ 0 when n ≥ 3
            ≥ n³ = cn³

Therefore, by the definition of big-Omega, 2n³ - 7n + 1 is in Ω(n³).


Prove that 2n³ - 7n + 1 is in Ω(n³)

Page 21:

Definition of big-Theta


In other words, if you want to prove big-Theta, just prove both big-Oh and big-Omega, separately.
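The slide's formula isn't in the transcript; the standard two-constant definition is:

```latex
f(n) \in \Theta(g(n)) \iff \exists c_1, c_2 > 0,\ \exists n_0 > 0,\ \forall n \geq n_0,\quad c_1 \cdot g(n) \leq f(n) \leq c_2 \cdot g(n)
```

This holds exactly when f(n) is in both O(g(n)) and Ω(g(n)), which is why proving the two bounds separately suffices.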

Page 22:

ConcepTest


Page 23:

ConcepTest


Page 24:

Exercise 1 on handout

Page 25:

ConcepTest


Page 26:


Counting the number of steps

One way to calculate the time complexity is to count the total number of steps taken by the function.

To count the total number of steps, we need to determine:
- the steps per one execution
- the total number of times (frequency) of execution

We then add up all these steps to get the total count.
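The slide's own code isn't reproduced in the transcript; a hypothetical example of the idea, matching the `total = total + 1` operation discussed on the next slide, is a doubly nested loop where we count how often the innermost statement runs.

```python
# Hypothetical example (the slide's code isn't in the transcript):
# count how many times the innermost statement runs for input size n.
def count_steps(n):
    total = 0
    for i in range(n):          # executes n times
        for j in range(n):      # executes n times per outer iteration
            total = total + 1   # the most frequent operation
    return total

# Each of the n outer iterations does n inner iterations: n * n steps total.
assert count_steps(10) == 100
```

Steps per execution times frequency of execution, summed over the lines, gives the total count; here the dominant line contributes n² steps.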

Page 27:

Example


● total = total + 1 occurs at least as often as any other operation

● So, we can count the number of times that this operation occurs, and use that as an asymptotic bound on the algorithm. (Why?)

Page 28:

Bounding the sum
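The slide's example isn't reproduced in the transcript. A classic bound at this point of a runtime analysis (assuming, hypothetically, a loop whose i-th iteration takes i steps) is:

```latex
\sum_{i=1}^{n} i = \frac{n(n+1)}{2} \in \Theta(n^2),
\quad\text{since}\quad \frac{n^2}{2} \leq \frac{n(n+1)}{2} \leq n^2 \ \text{for all } n \geq 1.
```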
