Asymptotic Notations and Analysis
Asymptotic Notations Properties
• Categorize algorithms based on asymptotic growth rate
• e.g. linear, quadratic, exponential
• Ignore constant factors and small inputs
• Estimate upper bound and lower bound on growth rate of time
complexity function
• Describe running time of algorithm as n grows to ∞.
Limitations
• Not always useful for analysis of fixed-size inputs.
• All results are for sufficiently large inputs.
Asymptotic Notations
Asymptotic Notations: Θ, O, Ω, o, ω
• We use Θ to mean “order exactly” (tight bound)
• O to mean “order at most” (tight upper bound)
• Ω to mean “order at least” (tight lower bound)
• o to mean “upper bound” (not asymptotically tight)
• ω to mean “lower bound” (not asymptotically tight)
Each notation defines a set of functions, which in practice is used to compare the growth rates of two functions.
Big-Oh Notation (O)
Intuitively:
Set of all functions whose rate of growth is the same as or lower than that of g(n); f(n) is bounded above by g(n) for all sufficiently large n.
We may write f(n) = O(g(n)) OR f(n) ∈ O(g(n))
If f, g: N → R⁺, then we can define Big-Oh as follows. For a given function g(n) ≥ 0, O(g(n)) denotes the set of functions
O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
f(n) = O(g(n)) means that g(n) is an asymptotically upper bound for f(n).
g(n) is an asymptotic upper bound for f(n):
f(n) ∈ O(g(n)) ⟺ ∃ c > 0, ∃ n₀ > 0 such that ∀ n ≥ n₀, 0 ≤ f(n) ≤ c·g(n)
Big-Oh Notation (O)
The idea behind the big-O notation is to establish an upper boundary
for the growth of a function f(n) for large n.
This boundary is specified by a function g(n) that is usually much
simpler than f(n).
We accept the constant C in the requirement
f(n) ≤ C·g(n) whenever n > n₀.
We are only interested in large n, so it is OK if
f(n) > C·g(n) for n ≤ n₀.
The relationship between f and g can be expressed by stating either that g(n) is an upper bound on the value of f(n) or that, in the long run, f grows at most as fast as g.
• As a simple illustrative example, we show that the function 2n² + 5n + 6 is O(n²).
• For all n ≥ 1, it is the case that
2n² + 5n + 6 ≤ 2n² + 5n² + 6n² = 13n²
• Hence, we can take c = 13 and n₀ = 1, and the definition is satisfied.
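A witness pair like this can be sanity-checked numerically. The following small C program is not from the slides; it is a minimal sketch that assumes the witnesses c = 13, n₀ = 1 from the example above and tests 0 ≤ f(n) ≤ c·g(n) over a range of sampled n:

#include <stdio.h>

/* f(n) = 2n^2 + 5n + 6 and g(n) = n^2, from the example above */
static double f(double n) { return 2*n*n + 5*n + 6; }
static double g(double n) { return n*n; }

int main(void) {
    const double c = 13.0, n0 = 1.0;   /* claimed witnesses */
    for (double n = n0; n <= 1e6; n *= 10) {
        if (!(0 <= f(n) && f(n) <= c * g(n))) {
            printf("bound fails at n = %.0f\n", n);
            return 1;
        }
    }
    printf("0 <= f(n) <= %.0f*g(n) held for all sampled n >= %.0f\n", c, n0);
    return 0;
}

Such a check can only support, never prove, the claim; the algebraic argument above is what establishes it for every n ≥ n₀.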
Example
Prove that 2n² = O(n³)
Proof:
Assume that f(n) = 2n², and g(n) = n³
f(n) = O(g(n)) ?
Now we have to find the existence of c and n₀
f(n) ≤ c·g(n) ⟹ 2n² ≤ c·n³ ⟹ 2 ≤ c·n
If we take c = 1 and n₀ = 2, OR c = 2 and n₀ = 1, then
2n² ≤ c·n³
Hence f(n) = O(g(n)), with c = 1 and n₀ = 2
Example
Prove that n² = O(n²)
Proof:
Assume that f(n) = n², and g(n) = n²
f(n) = O(g(n)) ?
Now we have to find the existence of c and n₀
f(n) ≤ c·g(n) ⟹ n² ≤ c·n² ⟹ 1 ≤ c
If we take c = 1 and n₀ = 1, then
n² ≤ c·n² for c = 1 and n ≥ 1
Hence n² = O(n²), where c = 1 and n₀ = 1
Example
Prove that 1000·n² + 1000·n = O(n²)
Proof:
Assume that f(n) = 1000·n² + 1000·n, and g(n) = n²
We have to find the existence of c and n₀ such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀
1000·n² + 1000·n ≤ c·n²
For c = 1001: 1000·n² + 1000·n ≤ 1001·n²
⟹ 1000·n ≤ n² ⟹ n² − 1000·n ≥ 0 ⟹ n(n − 1000) ≥ 0,
which is true for n ≥ 1000.
Hence f(n) = O(g(n)) for c = 1001 and n₀ = 1000
Example
Prove that n³ ≠ O(n²)
Proof:
On the contrary, assume that there exist positive constants c and n₀ such that
0 ≤ n³ ≤ c·n² for all n ≥ n₀
⟹ n ≤ c
Since c is a fixed constant while n grows without bound, n ≤ c cannot hold for all large n.
Hence our supposition is wrong: n³ ≤ c·n² for all n ≥ n₀ is not true for any combination of c and n₀.
Hence n³ = O(n²) does not hold.
Example
Prove that 2n + 10 = O(n)
Proof:
Assume that f(n) = 2n + 10, and g(n) = n
f(n) = O(g(n)) ?
Now we have to find the existence of c and n₀
f(n) ≤ c·g(n) ⟹ 2n + 10 ≤ c·n ⟹ (c − 2)·n ≥ 10 ⟹ n ≥ 10/(c − 2)
We need c > 2 for n > 0; if we pick c = 3, then n₀ = 10.
Then 2n + 10 ≤ c·n for c = 3 and n ≥ 10
Hence 2n + 10 = O(n), where c = 3 and n₀ = 10
Example
Prove that 3n³ + 20n² + 5 = O(n³)
Proof:
We need c > 0 and n₀ ≥ 1 such that
3n³ + 20n² + 5 ≤ c·n³ for n ≥ n₀
This is true for c = 4, n₀ = 21, OR c = 28, n₀ = 1
Hence 3n³ + 20n² + 5 = O(n³), where c = 4 and n₀ = 21
Example
Prove that 10n + 500 = O(n)
Proof:
The function n by itself will never be larger than the function 10n + 500, no matter how large n gets.
However, there exist constants c and n₀ such that
10n + 500 ≤ c·n when n ≥ n₀.
One choice for these constants is c = 20 and n₀ = 50.
Other choices work as well: for example, any value of c > 20 will work for n₀ = 50.
Therefore, 10n + 500 = O(n).
Example
Which of the following functions is larger by order of growth: (1/3)ⁿ or 17?
• Let’s check if (1/3)ⁿ = O(17):
(1/3)ⁿ ≤ c·17, which is true for c = 1, n₀ = 1
• Let’s check if 17 = O((1/3)ⁿ):
17 ≤ c·(1/3)ⁿ, which requires c ≥ 17·3ⁿ,
and hence c cannot be a constant for large n.
• That’s why (1/3)ⁿ is lower in growth rate than 17.
Example
Prove or disprove: 2^(2n) = O(2ⁿ)?
• To prove the above we would have to show
2^(2n) ≤ C·2ⁿ
• 2ⁿ·2ⁿ ≤ C·2ⁿ
• This inequality holds only when C ≥ 2ⁿ,
• which makes C non-constant.
• Hence we can’t bound 2^(2n) by O(2ⁿ)
Example
Prove that 8n² + 2n − 3 ∈ O(n²)
Proof:
We need c > 0 and n₀ ≥ 1 such that
8n² + 2n − 3 ≤ c·n² for n ≥ n₀
Consider the reasoning:
f(n) = 8n² + 2n − 3 ≤ 8n² + 2n ≤ 8n² + 2n² = 10n² (for n ≥ 1)
Hence 8n² + 2n − 3 ∈ O(n²), where c = 10 and n₀ = 1
Example
Can you bound 3ⁿ = O(2ⁿ)?
To prove this we would have to show
3ⁿ ≤ C·2ⁿ
Since 3ⁿ = (3/2)ⁿ·2ⁿ,
this inequality holds only when C ≥ (3/2)ⁿ, which makes C non-constant.
Hence we can’t bound 3ⁿ by O(2ⁿ)
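To see concretely why no constant C works here, one can print the required C = (3/2)ⁿ for growing n; this is a throwaway sketch of my own, not part of the slides:

#include <stdio.h>
#include <math.h>

int main(void) {
    /* The required constant C >= (3/2)^n grows without bound */
    for (int n = 1; n <= 60; n += 10)
        printf("n = %2d  ->  3^n / 2^n = %.3e\n", n, pow(1.5, n));
    return 0;
}

The ratio eventually exceeds any fixed C, which is exactly why 3ⁿ ≠ O(2ⁿ).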
Example
Which of the following functions is larger by order of growth: N log N or N^1.5?
Note that g(N) = N^1.5 = N · N^0.5
Hence, between f(N) and g(N), we only need to compare the growth rates of log N and N^0.5
Equivalently, we can compare the growth rate of log²N with N
Now we can refer to the previously stated result to figure out whether f(N) or g(N) grows faster!
Big-Omega Notation (Ω)
Intuitively:
Set of all functions whose rate of growth is the same as or higher than that of g(n).
We may write f(n) = Ω(g(n)) OR f(n) ∈ Ω(g(n))
If f, g: N → R⁺, then we can define Big-Omega as follows. For a given function g(n) ≥ 0, Ω(g(n)) denotes the set of functions
Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }
f(n) = Ω(g(n)) means that g(n) is an asymptotically lower bound for f(n).
f(n) ∈ Ω(g(n)) ⟺ ∃ c > 0, ∃ n₀ > 0 such that ∀ n ≥ n₀, f(n) ≥ c·g(n)
g(n) is an asymptotically lower bound for f(n).
Note the duality rule: t(n) ∈ Ω(f(n)) ⟺ f(n) ∈ O(t(n))
Prove that 3n + 2 ∈ Ω(n)
Proof:
Assume that f(n) = 3n + 2, and g(n) = n
f(n) ∈ Ω(g(n)) ?
We have to find the existence of c and n₀ such that
c·g(n) ≤ f(n) for all n ≥ n₀
c·n ≤ 3n + 2
On the R.H.S. a positive term is added to 3n, so L.H.S. ≤ R.H.S. for all values of n when c = 3.
Hence f(n) ∈ Ω(g(n)), for c = 3 and n₀ = 1
Example
Prove that 5n² ∈ Ω(n)
Proof:
Assume that f(n) = 5n², and g(n) = n
f(n) ∈ Ω(g(n)) ?
We have to find the existence of c and n₀ such that
c·g(n) ≤ f(n) for all n ≥ n₀
c·n ≤ 5n² ⟹ c ≤ 5n
If we take c = 5 and n₀ = 1, then
c·n ≤ 5n² for all n ≥ n₀
Hence f(n) ∈ Ω(g(n)), for c = 5 and n₀ = 1
Example
Prove that 5n² + 2n − 3 ∈ Ω(n²)
Proof:
Assume that f(n) = 5n² + 2n − 3, and g(n) = n²
f(n) ∈ Ω(g(n)) ?
We have to find the existence of c and n₀ such that
c·g(n) ≤ f(n) for all n ≥ n₀
c·n² ≤ 5n² + 2n − 3
We can take c = 5 provided that 2n − 3 is positive, which holds for n ≥ 2. Therefore n₀ = 2.
And hence f(n) ∈ Ω(g(n)), for c = 5 and n₀ = 2
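As with Big-O, an Ω witness pair can be spot-checked in code. A minimal sketch of my own, assuming the c = 5, n₀ = 2 witnesses from the proof above:

#include <stdio.h>

static long long f(long long n) { return 5*n*n + 2*n - 3; }
static long long g(long long n) { return n*n; }

int main(void) {
    const long long c = 5, n0 = 2;       /* claimed witnesses */
    for (long long n = n0; n <= 1000000; n *= 10)
        if (c * g(n) > f(n)) {           /* the lower bound must hold */
            printf("c*g(n) <= f(n) fails at n = %lld\n", n);
            return 1;
        }
    printf("c*g(n) <= f(n) held for all sampled n >= %lld\n", n0);
    return 0;
}

Again, the sampled check only supports the claim; the algebra above proves it.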
Example
Prove that 100n + 5 ∉ Ω(n²)
Proof:
Let f(n) = 100n + 5, and g(n) = n²
Assume that f(n) ∈ Ω(g(n)).
Then there exist c and n₀ such that
c·g(n) ≤ f(n) for all n ≥ n₀, i.e. c·n² ≤ 100n + 5.
For this inequality to hold, f(n) would have to grow at least as fast as g(n).
But lim_{n→∞} f(n)/g(n) = lim_{n→∞} (100n + 5)/n² = 0,
which means g(n) grows faster than f(n).
Hence f(n) ∉ Ω(g(n))
Theta Notation (Θ)
Intuitively: Set of all functions that have the same rate of growth as g(n).
When a problem is Θ(n), this represents both an upper and a lower bound, i.e. it is O(n) and Ω(n) (no algorithmic gap).
We may write f(n) = Θ(g(n)) OR f(n) ∈ Θ(g(n))
If f, g: N → R⁺, then we can define Big-Theta as follows. For a given function g(n), Θ(g(n)) denotes the set of functions
Θ(g(n)) = { f(n) : there exist positive constants c₁, c₂ and n₀ such that 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀ }
f(n) = Θ(g(n)) means that f(n) is equal to g(n) to within a constant factor, and g(n) is an asymptotically tight bound for f(n).
We say that g(n) is an asymptotically tight bound for f(n):
f(n) ∈ Θ(g(n)) ⟺ ∃ c₁ > 0, c₂ > 0, ∃ n₀ > 0 such that ∀ n ≥ n₀, c₁·g(n) ≤ f(n) ≤ c₂·g(n)
Prove that ½·n² − ½·n = Θ(n²)
Proof:
Assume that f(n) = ½·n² − ½·n, and g(n) = n²
f(n) ∈ Θ(g(n))?
We have to find the existence of c₁, c₂ and n₀ such that
c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀
Upper bound: ½n² − ½n ≤ ½n² for all n ≥ 0, so c₂ = ½.
Lower bound: ½n² − ½n ≥ ½n² − (½n)(½n) = ¼n² for n ≥ 2, so c₁ = ¼.
Hence ¼n² ≤ ½n² − ½n ≤ ½n², i.e. c₁·g(n) ≤ f(n) ≤ c₂·g(n) for n ≥ 2, with c₁ = ¼ and c₂ = ½.
Hence f(n) ∈ Θ(g(n)) ⟹ ½·n² − ½·n = Θ(n²)
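The two-sided bound can likewise be spot-checked numerically; a small sketch of my own, assuming the witnesses c₁ = ¼, c₂ = ½, n₀ = 2 from the proof above:

#include <stdio.h>

static double f(double n) { return 0.5*n*n - 0.5*n; }
static double g(double n) { return n*n; }

int main(void) {
    const double c1 = 0.25, c2 = 0.5, n0 = 2;  /* claimed witnesses */
    for (double n = n0; n <= 1e6; n *= 10)
        if (!(c1*g(n) <= f(n) && f(n) <= c2*g(n))) {
            printf("sandwich fails at n = %.0f\n", n);
            return 1;
        }
    printf("c1*g(n) <= f(n) <= c2*g(n) held for all sampled n >= %.0f\n", n0);
    return 0;
}

Note that at n = 2 the lower bound holds with equality (f(2) = 1 = ¼·g(2)), matching the algebra.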
Example
Prove that 2n² + 3n + 6 ∉ Θ(n³)
Proof: Let f(n) = 2n² + 3n + 6, and g(n) = n³
We have to show that f(n) ∉ Θ(g(n))
On the contrary, assume that f(n) ∈ Θ(g(n)), i.e. there exist positive constants c₁, c₂ and n₀ such that:
c₁·g(n) ≤ f(n) ≤ c₂·g(n)
Solve for c₂:
f(n) ≤ c₂·g(n): 2n² + 3n + 6 ≤ 2n² + 3n² + 6n² = 11n² ≤ c₂·n³ for c₂ = 11 and n₀ = 1
Solve for c₁:
c₁·g(n) ≤ f(n) ⟹ c₁·n³ ≤ 2n² + 3n + 6 ≤ 11n² (for n ≥ 1) ⟹ c₁·n ≤ 11,
which is not possible for large n.
Hence f(n) ∉ Θ(g(n)) ⟹ 2n² + 3n + 6 ∉ Θ(n³)
Example
Prove that 3n + 2 = Θ(n)
Proof: Let f(n) = 3n + 2, and g(n) = n
We must find positive constants c₁, c₂ and n₀ such that:
c₁·g(n) ≤ f(n) ≤ c₂·g(n), i.e. c₁·n ≤ 3n + 2 ≤ c₂·n
Take c₁ = 3, since 3n ≤ 3n + 2 for all n ≥ 1
3n + 2 ≤ 3n + 2n ≤ c₂·n ⟹ 5n ≤ c₂·n ⟹ 5 ≤ c₂
So c₁ = 3, c₂ = 5, n₀ = 1
Hence f(n) ∈ Θ(g(n)) ⟹ 3n + 2 = Θ(n)
Example
Prove that ½·n² − 3n = Θ(n²) (example from the book)
Proof:
Let f(n) = ½·n² − 3n, and g(n) = n². f(n) ∈ Θ(g(n))?
We have to find the existence of c₁, c₂ and n₀ such that
c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀
c₁·n² ≤ ½·n² − 3n ≤ c₂·n²
Since ½n² − 3n ≤ ½n² for n ≥ 1, take c₂ = ½;
and ½n² − 3n ≥ ¼n² ⟺ ¼n² ≥ 3n ⟺ n ≥ 12, so take c₁ = ¼.
Thus c₁·g(n) ≤ f(n) ≤ c₂·g(n) for n ≥ 12, with c₁ = ¼ and c₂ = ½
Hence f(n) ∈ Θ(g(n)) ⟹ ½·n² − 3n = Θ(n²)
Little-Oh Notation (o)
For a given function g(n) ≥ 0, o(g(n)) denotes the set of functions
o(g(n)) = { f(n) : for any positive constant c, there exists a constant n₀ > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n₀ }
Equivalently, f(n) = o(g(n)) iff lim_{n→∞} f(n)/g(n) = 0.
e.g., 2n = o(n²) but 2n² ≠ o(n²).
o-notation is used to denote an upper bound that is not asymptotically tight.
f(n) becomes insignificant relative to g(n) as n approaches infinity; g(n) is an upper bound for f(n) that is not asymptotically tight.
Prove that 2n² ∈ o(n³)
Proof:
Assume that f(n) = 2n², and g(n) = n³
f(n) ∈ o(g(n)) ?
Now we have to find the existence of n₀ for any c:
f(n) < c·g(n) ⟹ 2n² < c·n³ ⟹ 2 < c·n
This is true for any c, because for any arbitrary c we can choose n₀ > 2/c so that the above inequality holds.
Hence f(n) ∈ o(g(n))
Example
Prove that n² ∉ o(n²)
Proof:
Assume that f(n) = n², and g(n) = n²
Now we have to show that f(n) ∉ o(g(n))
Since f(n) < c·g(n) ⟹ n² < c·n² ⟹ 1 < c,
the inequality fails for any c ≤ 1 (e.g. c = ½), no matter how large n₀ is.
Our definition of little-o requires the bound to hold for every positive c, but here there is a constraint on c.
Hence n² ∉ o(n²)
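The limit characterization gives a quick numerical contrast between these two examples; the following sketch is my own, not from the slides:

#include <stdio.h>

int main(void) {
    /* f/g -> 0 means f = o(g); a ratio stuck at a constant means it is not */
    for (double n = 10; n <= 1e6; n *= 10) {
        double r1 = (2*n*n) / (n*n*n);  /* 2n^2 / n^3 -> 0, so 2n^2 = o(n^3) */
        double r2 = (n*n) / (n*n);      /* n^2 / n^2 = 1, so n^2 is not o(n^2) */
        printf("n = %8.0f   2n^2/n^3 = %.6f   n^2/n^2 = %.1f\n", n, r1, r2);
    }
    return 0;
}

The first ratio shrinks toward 0 while the second never drops below 1, mirroring the two proofs.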
Little-Omega Notation (ω)
For a given function g(n), ω(g(n)) denotes the set of functions
ω(g(n)) = { f(n) : for any positive constant c, there exists a constant n₀ > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n₀ }
Equivalently, f(n) = ω(g(n)) iff lim_{n→∞} f(n)/g(n) = ∞.
e.g., n²/2 = ω(n) but n²/2 ≠ ω(n²).
Little-ω notation is used to denote a lower bound that is not asymptotically tight.
f(n) becomes arbitrarily large relative to g(n) as n approaches infinity.
Prove that 5n² ∈ ω(n)
Proof:
Assume that f(n) = 5n², and g(n) = n
f(n) ∈ ω(g(n)) ?
We have to prove that for any c there exists n₀ such that
c·g(n) < f(n) for all n ≥ n₀
c·n < 5n² ⟹ c < 5n
This is true for any c, because for any arbitrary c, e.g. c = 1,000,000, we can choose n₀ = 1,000,000/5 = 200,000 and the above inequality does hold.
And hence f(n) ∈ ω(g(n))
Example
Prove that 5n + 10 ∉ ω(n)
Proof:
Assume that f(n) = 5n + 10, and g(n) = n
f(n) ∈ ω(g(n)) ?
We would have to find, for any c, an n₀ such that
c·g(n) < f(n) for all n ≥ n₀
c·n < 5n + 10; if we take c = 16 then
16n < 5n + 10 ⟹ 11n < 10, which is not true for any positive integer n.
Hence f(n) ∉ ω(g(n))
Example
Prove that 100n ∉ ω(n²)
Proof:
Let f(n) = 100n, and g(n) = n²
Assume that f(n) ∈ ω(g(n))
Then for any c there must exist n₀ such that
c·g(n) < f(n) for all n ≥ n₀
⟹ c·n² < 100n ⟹ c·n < 100
If we take c = 100, this requires n < 1, which cannot hold for all n ≥ n₀.
Hence f(n) ∉ ω(g(n)), i.e. 100n ∉ ω(n²)
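The same ratio test works for ω: f(n)/g(n) must grow without bound. A quick sketch of my own contrasting two of the results above:

#include <stdio.h>

int main(void) {
    for (double n = 10; n <= 1e6; n *= 10) {
        double r1 = (5*n*n) / n;     /* 5n^2/n = 5n -> infinity, so 5n^2 = omega(n) */
        double r2 = (5*n + 10) / n;  /* -> 5, a constant, so 5n + 10 is not omega(n) */
        printf("n = %8.0f   5n^2/n = %.0f   (5n+10)/n = %.4f\n", n, r1, r2);
    }
    return 0;
}

One ratio diverges while the other settles at the constant 5, as the proofs predict.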
Asymptotic Functions Summary
If f(n) = Θ(g(n)) we say that f(n) and g(n) grow at the same rate, asymptotically.
If f(n) = O(g(n)) and f(n) ≠ Ω(g(n)), then we say that f(n) is asymptotically slower growing than g(n).
If f(n) = Ω(g(n)) and f(n) ≠ O(g(n)), then we say that f(n) is asymptotically faster growing than g(n).
Usefulness of Notations
It is not always possible to determine the behaviour of an algorithm using Θ-notation.
For example, given a problem with n inputs, we may have an algorithm that solves it in a·n² time when n is even and c·n time when n is odd. OR
We may prove that an algorithm never uses more than e·n² time and never less than f·n time.
In either case we can neither claim Θ(n) nor Θ(n²) to be the order of the time usage of the algorithm.
Big O and Ω notation allow us to give at least partial information.
To express the efficiency of our algorithms, which of the three notations should we use?
As computer scientists we generally like to express our algorithms as big O, since we would like to know the upper bounds of our algorithms.
Why?
If we know the worst case then we can aim to improve it and/or avoid it.
Even though it is correct to say “7n − 3 is O(n³)”, a better statement is “7n − 3 is O(n)”; that is, one should make the approximation as tight as possible.
Simple Rule:
Drop lower-order terms and constant factors:
7n − 3 is O(n)
8n²·log n + 5n² + n is O(n²·log n)
Strictly speaking, this use of the equals sign is incorrect:
• the relationship is a set inclusion, not an equality
• f(n) ∈ O(g(n)) is better
Big Oh Does Not Tell the Whole Story
Question?
• If two algorithms A and B have the same asymptotic complexity, say O(n²), will the execution times of the two algorithms always be the same?
• How to select between the two algorithms having the same
asymptotic performance?
Answer:
• They may not be the same. There is this small matter of the
constant of proportionality.
• Suppose that A does ten operations for each data item, but
algorithm B only does three.
• It is reasonable to expect B to be faster than A even though both
have the same asymptotic performance. The reason is that
asymptotic analysis ignores constants of proportionality.
Algorithm_A {
set up the algorithm; /*taking 50 time units*/
read in n elements into array A; /* 3 units per element */
for (i = 0; i < n; i++) {
do operation1 on A[i]; /* takes 10 units */
do operation2 on A[i]; /* takes 5 units */
do operation3 on A[i]; /* takes 15 units */
}
}
TA(n) = 50 + 3n + (10 + 5 + 15)·n
      = 50 + 33·n
Algorithm_B {
set up the algorithm; /*taking 200 time units*/
read in n elements into array A; /* 3 units per element */
for (i = 0; i < n; i++) {
do operation1 on A[i]; /* takes 10 units */
do operation2 on A[i]; /* takes 5 units */
}
}
TB(n) = 200 + 3n + (10 + 5)*n
= 200 + 18*n
Both algorithms have time complexity O(n).
Algorithm A sets up faster than B, but does more operations on the data
Algorithm A is the better choice for small values of n.
For values of n > 10, algorithm B is the better choice
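The crossover point can be computed directly from the two cost formulas; the following is a minimal sketch of my own using TA(n) = 50 + 33n and TB(n) = 200 + 18n from the slide above:

#include <stdio.h>

static long TA(long n) { return 50 + 33*n; }   /* cost model of Algorithm_A */
static long TB(long n) { return 200 + 18*n; }  /* cost model of Algorithm_B */

int main(void) {
    for (long n = 1; n <= 100; n++)
        if (TB(n) < TA(n)) {                   /* first n where B wins */
            printf("B becomes cheaper at n = %ld (TA = %ld, TB = %ld)\n",
                   n, TA(n), TB(n));
            return 0;
        }
    return 0;
}

Solving 50 + 33n = 200 + 18n gives 15n = 150, i.e. n = 10; the two costs tie there, and the loop prints n = 11 as the first integer where TB is strictly smaller.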
A Misconception
A common misconception is that worst case running time is somehow
defined by big-Oh, and that best case is defined by big-Omega.
There is no formal relationship like this.
However, worst case and big-Oh are commonly used together,
because they are both techniques for finding an upper bound on
running time.
Relations over Asymptotic Notations
Maximum rule: O(f(n) + g(n)) = O(max(f(n), g(n)))
Additive and Multiplicative Property:
If e(n) = O(g(n)) & f(n) = O(h(n)) then e(n) + f(n) = O(g(n) + h(n))
If e(n) = O(g(n)) & f(n) = O(h(n)) then e(n) · f(n) = O(g(n) · h(n))
Dichotomy Property:
If f(n) = O(g(n)) & g(n) = O(f(n)) then f(n) = Θ(g(n))
If f(n) = Ω(g(n)) & g(n) = Ω(f(n)) then f(n) = Θ(g(n))
Reflexive Property: f(n) = O(f(n)), f(n) = Ω(f(n)), and f(n) = Θ(f(n));
but f(n) ≠ o(f(n)) and f(n) ≠ ω(f(n))
Symmetry over Θ: f(n) = Θ(g(n)) ⟺ g(n) = Θ(f(n))
Transitivity Property:
f(n) = Θ(g(n)) & g(n) = Θ(h(n)) ⟹ f(n) = Θ(h(n))
f(n) = O(g(n)) & g(n) = O(h(n)) ⟹ f(n) = O(h(n))
f(n) = Ω(g(n)) & g(n) = Ω(h(n)) ⟹ f(n) = Ω(h(n))
f(n) = o(g(n)) & g(n) = o(h(n)) ⟹ f(n) = o(h(n))
f(n) = ω(g(n)) & g(n) = ω(h(n)) ⟹ f(n) = ω(h(n))