Chapter 2
Introduction to
Algorithms
Dr. Muhammad Hanif Durad
Department of Computer and Information Sciences
Pakistan Institute of Engineering and Applied Sciences
hanif@pieas.edu.pk
Some slides have been adapted with thanks from other lectures
available on the Internet. It made my life easier, as life is always
miserable at PIEAS (Sir Muhammad Yusaf Kakakhil)
Dr. Hanif Durad 2
Lecture Outline
 Algorithm
 Analysis of Algorithms
 Computational Model
 Random Access Machine (RAM)
 Average, Worst, and Best Cases
 Higher order functions of n are normally considered less efficient
 Asymptotic Notation
 Θ, O, Ω, o, ω
 Why Does Growth Rate Matter?
Algorithm (1/2)
 Informally,
 A tool for solving a well-specified computational
problem.
 Example: sorting
input: A sequence of numbers.
output: An ordered permutation of the input.
Input → Algorithm → Output
D:DSALCOMP 550-00101-algo.ppt
Algorithm (2/2)
 What is an Algorithm?
 a clearly specified set of simple instructions to be
followed to solve a problem
 Takes a set of values as input and
 produces a value, or set of values, as output
 Usually specified as pseudo-code
 Data structures
 Methods of organizing data
 Program = algorithms + data structures
D:DSALCOMP171 Data Structures and Algorithmintro_algo.ppt
Analysis of Algorithms (1/2)
 Correctness:
 Does the algorithm do what is intended?
 Efficiency:
 What is the running time of the algorithm?
 How much storage does it consume?
 Different algorithms may all be correct
 Which should I use?
 Analysis of algorithms uses mathematical
techniques to predict the efficiency of algorithms.
D:Data StructuresHanif_Searchch1intro.ppt+
D:DSALCD3570lecture1_introduction.pdf
Analysis of Algorithms (2/2)
 What do we mean by efficiency?
 Efficiency is usually given with respect to some cost measure
 Cost measures are defined in terms of resource usage:
 Execution time
 Memory usage
 Communication bandwidth
 Computer hardware
 Energy consumption
 etc.
 We will mainly look at cost in terms of execution time
D:Data StructuresHanif_Searchch1intro.ppt+
D:DSALCD3570lecture1_introduction.pdf
Running-time of algorithms
 Bounds are for the algorithms, rather than
programs
 programs are just implementations of an algorithm, and
almost always the details of the program do not affect
the bounds
 Bounds are for algorithms, rather than problems
 A problem can be solved with several algorithms, some
are more efficient than others
D:Data StructuresCOMP171 Data Structures and Algorithmalgo.ppt
But, how to measure the time?
 Multiplication and addition: which one takes longer?
 How do we measure >=, assignment, &&, ||, etc.?
Machine dependent?
What is the efficiency of an
algorithm?
Run time in the computer: Machine Dependent
Example: Need to multiply two positive integers a and b
Subroutine 1: Multiply a and b
Subroutine 2: V ← a, W ← b
While W > 1
V ← V + a; W ← W − 1
Output V
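As a sketch, the two subroutines can be written in Python with an explicit operation counter (the function names and the counter are illustrative additions, not part of the slide):

```python
def multiply_v1(a, b):
    """Subroutine 1: a single multiplication -- one basic operation."""
    return a * b

def multiply_v2(a, b):
    """Subroutine 2: repeated addition, following the slide's pseudo-code."""
    v, w = a, b
    ops = 0              # basic operations performed in the loop
    while w > 1:         # the loop body runs b - 1 times
        v = v + a        # one addition
        w = w - 1        # one subtraction
        ops += 2
    return v, ops

product, ops = multiply_v2(7, 5)
assert product == multiply_v1(7, 5) == 35
assert ops == 2 * (5 - 1)   # proportional to b; Subroutine 1 always uses one
```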
Solution: Machine Independent
Analysis
We assume that every basic operation takes constant time:
Example Basic Operations:
Addition, Subtraction, Multiplication, Memory Access
Non-basic Operations:
Sorting, Searching
Efficiency of an algorithm is the number of basic
operations it performs
We do not distinguish between the basic operations.
Subroutine 1 uses 1 basic operation (a single multiplication)
Subroutine 2 uses on the order of b basic operations
Subroutine 1 is more efficient.
This measure is good for all large input sizes
In fact, we will not worry about the exact values, but will look at “broad
classes” of values, or the growth rates
Let there be n inputs.
If an algorithm needs n basic operations and another needs 2n basic
operations, we will consider them to be in the same efficiency
category.
However, we distinguish between exp(n), n, log(n)
Computational Model
 Should be simple, or even simplistic.
 Assign uniform cost for all simple operations and
memory accesses. (Not true in practice.)
 Question: Is this OK?
 Should be widely applicable.
 Can’t assume the model to support complex
operations. Ex: No SORT instruction.
 Size of a word of data is finite.
 Why?
D:DSALCOMP 550-00101-algo.ppt
Random Access Machine (RAM)
 Generic single-processor model.
 Supports simple constant-time instructions found in real
computers.
 Arithmetic (+, –, *, /, %, floor, ceiling).
 Data Movement (load, store, copy).
 Control (branch, subroutine call).
 Run time (cost) is uniform (1 time unit) for all simple
instructions.
 Memory is unlimited.
 Flat memory model – no hierarchy.
 Access to a word of memory takes 1 time unit.
 Sequential execution – no concurrent operations.
D:DSALCOMP 550-00101-algo.ppt
Complexity
 Complexity is the number of steps required to solve a problem.
 The goal is to find the best algorithm to solve the problem with
the smallest number of steps
 Complexity of Algorithms
 The size of the problem is a measure of the quantity of the input data n
 The time needed by an algorithm, expressed as a function of the size
of the problem (it solves), is called the (time) complexity of the
algorithm T(n)
D:DSALAlgorithms and computational complexity
03_Growth_of_Functions_1.ppt, P-3
Basic idea: counting operations
 Running Time: Number of primitive steps that are executed
 most statements roughly require the same amount of time
 y = m * x + b
 c = 5 / 9 * (t - 32 )
 z = f(x) + g(y)
 Each algorithm performs a sequence of basic operations:
 Arithmetic: (low + high)/2
 Comparison: if ( x > 0 ) …
 Assignment: temp = x
 Branching: while ( true ) { … }
 …
Basic idea: counting operations
 Idea: count the number of basic operations
performed on the input.
 Difficulties:
 Which operations are basic?
 Not all operations take the same amount of time.
 Operations take different times with different
hardware or compilers
Measures of Algorithm
Complexity
 Let T(n) denote the number of operations required by an
algorithm to solve a given class of problems
 Often T(n) depends on the input, in such cases one can talk
about
 Worst-case complexity,
 Best-case complexity,
 Average-case complexity of an algorithm
 Alternatively, one can determine bounds (upper or lower)
on T(n)
Measures of Algorithm
Complexity
 Worst-Case Running Time: the longest time for any input
size of n
 provides an upper bound on running time for any input
 Best-Case Running Time: the shortest time for any input
size of n
 provides lower bound on running time for any input
 Average-Case Behavior: the expected performance
averaged over all possible inputs
 it is generally better than worst case behavior, but sometimes it’s
roughly as bad as worst case
 difficult to compute
Average, Worst, and Best Cases
 An algorithm may run faster on certain data sets
than others.
 Finding the average case can be very difficult,
so typically algorithms are measured in the
worst case time complexity.
 Also, in certain application domains (e.g., air
traffic control, medical, etc.) knowing the worst
case time complexity is of crucial importance.
D:Data StructuresICS202Lecture05.ppt
Worst vs. Average Case
D:Data StructuresICS202Lecture05.ppt
Example 1: Sum Series

Computes the sum Σ i³ for i = 1 to N:

Algorithm                        Step Count
1   sum = 0                      1
2   for i = 1 to N               2N + 2
3       sum = sum + i*i*i        4N
4   return sum                   1
    Total                        6N + 4

 Lines 1 and 4 count for one unit each
 Line 3: executed N times, each time four units
 Line 2: (1 for initialization, N+1 for all the tests, N for all the
increments) total 2N + 2
 total cost: 6N + 4 ⇒ O(N)
D:Data StructuresCOMP171 Data Structures and Algorithmalgo.ppt
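A minimal Python sketch of the example, instrumented with the same per-line step counts (the function name and counter are illustrative; the slide's algorithm is assumed to sum i³ in a simple for loop):

```python
def sum_cubes(n):
    """Sum of i**3 for i = 1..n, counting steps as on the slide."""
    steps = 1            # line 1: sum = 0 (one unit)
    total = 0
    steps += 2 * n + 2   # line 2: 1 init + (n+1) tests + n increments
    for i in range(1, n + 1):
        total += i * i * i
        steps += 4       # line 3: four units per iteration
    steps += 1           # line 4: return (one unit)
    return total, steps

total, steps = sum_cubes(10)
assert total == (10 * 11 // 2) ** 2   # closed form (n(n+1)/2)^2
assert steps == 6 * 10 + 4            # matches the slide's 6N + 4
```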
Example 2: Sequential Search

Algorithm                                            Step Count
// Searches for x in array A of n items
// returns index of found item, or n+1 if not found
Seq_Search( A[n]: array, x: item){                   0
    done = false                                     1
    i = 1                                            1
    while ((i <= n) and (A[i] <> x)){                n + 1
        i = i + 1                                    n
    }                                                0
    return i                                         1
}                                                    0
Total (worst case)                                   2n + 4
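The pseudo-code translates almost line-for-line into Python (the unused done flag is dropped; Python lists are 0-based, so the 1-based index is adjusted on access):

```python
def seq_search(a, x):
    """Sequential search: returns the 1-based index of x in a,
    or len(a) + 1 if x is not present."""
    n = len(a)
    i = 1
    while i <= n and a[i - 1] != x:
        i += 1
    return i

data = [7, 3, 9, 5]
assert seq_search(data, 9) == 3    # found at position 3
assert seq_search(data, 4) == 5    # not found: returns n + 1
```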
Example: Sequential Search
 worst-case running time
 when x is not in the original array A
 in this case, the while loop needs 2(n + 1) comparisons + c other
operations
 So, T(n) = 2n + 2 + c ⇒ linear complexity
 best-case running time
 when x is found in A[1]
 in this case, the while loop needs 2 comparisons + c other operations
 So, T(n) = 2 + c ⇒ constant complexity
Order of Growth
 For very large input size, it is the rate of growth, or order of
growth, that matters asymptotically
 We can ignore the lower-order terms, since they are
relatively insignificant for very large n
 We can also ignore the leading term’s constant coefficient,
since it is not as important for the rate of growth in
computational efficiency for very large n
 Higher-order functions of n are normally considered less
efficient
Asymptotic Notation
 Θ, O, Ω, o, ω
 Used to describe the running times of algorithms
 Instead of the exact running time, say Θ(n²)
 Defined for functions whose domain is the set of natural
numbers, N
 Determine sets of functions; in practice used to compare two
functions
Asymptotic Notation
 By now you should have an intuitive feel for
asymptotic (big-O) notation:
 What does O(n) running time mean? O(n²)?
O(n lg n)?
 Our first task is to define this notation more
formally and completely
Big-O notation
(Upper Bound – Worst Case)
 For a given function g(n), we denote by O(g(n)) the set of functions
 O(g(n)) = {f(n): there exist positive constants c > 0 and n₀ > 0 such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀}
 We say g(n) is an asymptotic upper bound for f(n):
 O(g(n)) means that as n → ∞, the execution time f(n) is at most c·g(n)
for some constant c
 What does O(g(n)) running time mean?
 The worst-case running time (upper bound) is a function of g(n) to
within a constant factor
 Equivalently: 0 ≤ lim (n→∞) f(n)/g(n) < ∞
Big-O notation
(Upper Bound – Worst Case)
[Figure: running time vs. n; beyond n₀ the curve f(n) stays below c·g(n), illustrating f(n) = O(g(n))]
O-notation
For a given function g(n), we
denote by O(g(n)) the set of
functions
O(g(n)) = {f(n): there exist
positive constants c and n₀ such
that
0 ≤ f(n) ≤ c·g(n),
for all n ≥ n₀}
We say g(n) is an asymptotic upper bound for f(n)
Big-O notation
(Upper Bound – Worst Case)
 This is a mathematically formal way of ignoring constant
factors, and looking only at the “shape” of the function
 f(n) = O(g(n)) should be read as saying that “f(n) is at
most g(n), up to constant factors”
 We usually will have f(n) be the running time of an
algorithm and g(n) a nicely written function
 e.g. the running time of the insertion sort algorithm is O(n²)
 Example: 2n² = O(n³), with c = 1 and n₀ = 2.
Examples of functions in O(n²)
 n²
 n² + n
 n² + 1000n
 1000n² + 1000n
Also,
 n
 n/1000
 n^1.99999
 n²/lg lg lg n
Big-O notation
(Upper Bound – Worst Case)
 Example 1: Is 2n + 7 = O(n)?
 Let
 T(n) = 2n + 7
 T(n) = n(2 + 7/n)
 Note for n = 7:
 2 + 7/n = 2 + 7/7 = 3
 T(n) ≤ 3n for all n ≥ 7 (so c = 3 and n₀ = 7)
 Then T(n) = O(n)
 lim (n→∞) [T(n)/n] = 2 < ∞ ⇒ T(n) = O(n)
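The slide's witness pair c = 3, n₀ = 7 can be spot-checked numerically. This is only a finite check over a range of n, which illustrates the definition but proves nothing asymptotically; the helper name is made up for this sketch:

```python
def is_big_o_witness(f, g, c, n0, n_max=10_000):
    """Finite spot-check of the Big-O condition 0 <= f(n) <= c*g(n)
    over n0 <= n <= n_max (illustrates the definition; not a proof)."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

# T(n) = 2n + 7 = O(n) with the slide's witnesses c = 3, n0 = 7:
assert is_big_o_witness(lambda n: 2 * n + 7, lambda n: n, c=3, n0=7)
# Below n0 the bound fails, which is why n0 = 7 is needed:
assert not (2 * 6 + 7 <= 3 * 6)
```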
Big-O notation
(Upper Bound – Worst Case)
 Example 2: Is 5n³ + 2n² + n + 10⁶ = O(n³)?
 Let
 T(n) = 5n³ + 2n² + n + 10⁶
 T(n) = n³(5 + 2/n + 1/n² + 10⁶/n³)
 Note for n = 100:
 5 + 2/n + 1/n² + 10⁶/n³ =
 5 + 2/100 + 1/10000 + 1 = 6.0201
 T(n) ≤ 6.0201·n³ for all n ≥ 100 (so c = 6.0201 and n₀ = 100)
 Then T(n) = O(n³)
 lim (n→∞) [T(n)/n³] = 5 < ∞ ⇒ T(n) = O(n³)
Big-O notation
(Upper Bound – Worst Case)
 Express the execution time as a function of the input size n
 Since only the growth rate matters, we can ignore the multiplicative
constants and the lower order terms, e.g.,
 n, n+1, n+80, 40n, n + log n are O(n)
 n^1.1 + 10000000000n is O(n^1.1)
 n² is O(n²)
 3n² + 6n + log n + 24.5 is O(n²)
 O(1) < O(log n) < O((log n)³) < O(n) < O(n²) < O(n³) < O(n^log n) <
O(2^sqrt(n)) < O(2^n) < O(n!) < O(n^n)
 Constant < Logarithmic < Linear < Quadratic < Cubic < Polynomial <
Exponential < Factorial
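The ordering of these classes can be observed numerically. Note that some of the inequalities, such as n^log n < 2^sqrt(n), only take effect for fairly large n; at n = 2^20 the whole chain below already holds (n! and n^n are omitted to keep the numbers manageable):

```python
# Evaluate the growth functions at n = 2^20 (chosen so lg n and sqrt(n)
# are exact integers, and all values stay exact Python ints).
n = 2 ** 20
lg_n = 20           # log2(n)
sqrt_n = 1024       # sqrt(n)
values = [
    ("1",          1),
    ("log n",      lg_n),
    ("(log n)^3",  lg_n ** 3),
    ("n",          n),
    ("n^2",        n ** 2),
    ("n^3",        n ** 3),
    ("n^log n",    n ** lg_n),
    ("2^sqrt(n)",  2 ** sqrt_n),
    ("2^n",        2 ** n),
]
# The values appear in strictly increasing order, matching the hierarchy:
assert all(lo[1] < hi[1] for lo, hi in zip(values, values[1:]))
```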
Ω-notation (Omega)
(Lower Bound – Best Case)
 For a given function g(n), we denote by Ω(g(n)) the set of functions
 Ω(g(n)) = {f(n): there exist positive constants c > 0 and n₀ > 0 such
that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀}
 We say g(n) is an asymptotic lower bound for f(n):
 Ω(g(n)) means that as n → ∞, the execution time f(n) is at least
c·g(n) for some constant c
 What does Ω(g(n)) running time mean?
 The best-case running time (lower bound) is a function of g(n) to
within a constant factor
 Equivalently: lim (n→∞) f(n)/g(n) > 0
Ω-notation
(Lower Bound – Best Case)
[Figure: running time vs. n; beyond n₀ the curve f(n) stays above c·g(n), illustrating f(n) = Ω(g(n))]
Ω-notation
For a given function g(n), we
denote by Ω(g(n)) the set of
functions
Ω(g(n)) = {f(n): there exist
positive constants c and n₀
such that
0 ≤ c·g(n) ≤ f(n)
for all n ≥ n₀}
We say g(n) is an asymptotic lower bound for f(n)
Ω-notation (Omega)
(Lower Bound – Best Case)
 We say Insertion Sort’s run time T(n) is Ω(n)
 For example
 the worst-case running time of insertion sort is O(n²),
and
 the best-case running time of insertion sort is Ω(n)
 Running time falls anywhere between a linear
function of n and a quadratic function of n
 Example: √n = Ω(lg n), with c = 1 and n₀ = 16.
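A quick numeric spot-check of the slide's witnesses c = 1, n₀ = 16 (a finite check only, not a proof):

```python
import math

# The Omega definition needs 0 <= 1 * lg n <= sqrt(n) for all n >= 16;
# here it is verified only up to a finite limit.
assert all(math.log2(n) <= math.sqrt(n) for n in range(16, 100_000))
# The bound is tight exactly at n0: lg 16 = sqrt(16) = 4.
assert math.log2(16) == math.sqrt(16) == 4.0
```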
Examples of functions in Ω(n²)
 n²
 n² + n
 n² − n
 1000n² + 1000n
 1000n² − 1000n
Also,
 n³
 n^2.00001
 n² lg lg lg n
Θ-notation (Theta)
(Tight Bound)
 In some cases,
 f(n) = O(g(n)) and f(n) = Ω(g(n))
 This means that the worst and best cases require the
same amount of time to within a constant factor
 In this case we use a new notation called “theta Θ”
 For a given function g(n), we denote by Θ(g(n))
the set of functions
 Θ(g(n)) = {f(n): there exist positive constants c₁ > 0, c₂ > 0
and n₀ > 0 such that
 c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀}
Θ-notation (Theta)
(Tight Bound)
 We say g(n) is an asymptotic tight bound for f(n):
 Theta notation
 Θ(g(n)) means that as n → ∞, the execution time f(n) is at most
c₂·g(n) and at least c₁·g(n) for some constants c₁ and c₂
 f(n) = Θ(g(n)) if and only if
 f(n) = O(g(n)) and f(n) = Ω(g(n))
 Equivalently: 0 < lim (n→∞) f(n)/g(n) < ∞
Θ-notation (Theta)
(Tight Bound)
[Figure: running time vs. n; beyond n₀ the curve f(n) stays between c₁·g(n) and c₂·g(n), illustrating f(n) = Θ(g(n))]
Θ-notation (Theta)
(Tight Bound)
 Example:
n²/2 − 2n = Θ(n²), with c₁ = 1/4, c₂ = 1/2, and
n₀ = 8.
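The Θ witnesses can be spot-checked the same way (a finite check with an illustrative helper name, not a proof):

```python
def theta_witnesses_hold(f, g, c1, c2, n0, n_max=10_000):
    """Finite spot-check of c1*g(n) <= f(n) <= c2*g(n) for n0 <= n <= n_max
    (illustrates the Theta definition; not a proof)."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max + 1))

f = lambda n: n * n / 2 - 2 * n      # the slide's f(n) = n^2/2 - 2n
g = lambda n: n * n
assert theta_witnesses_hold(f, g, c1=0.25, c2=0.5, n0=8)
# The lower bound is tight at n0 = 8: f(8) = 16 = (1/4) * 8^2.
assert f(8) == 0.25 * 8 ** 2 == 16
```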
o-notation
 For a given function g(n), we denote by o(g(n)) the set
of functions:
o(g(n)) = {f(n): for any positive constant c > 0, there
exists a constant n₀ > 0 such that 0 ≤ f(n) < c·g(n) for
all n ≥ n₀}
 f(n) becomes insignificant relative to g(n) as n
approaches infinity: lim (n→∞) [f(n)/g(n)] = 0
 We say g(n) is an upper bound for f(n) that is not
asymptotically tight.
O(*) versus o(*)
O(g(n)) = {f(n): there exist positive constants c and n₀ such that
0 ≤ f(n) ≤ c·g(n), for all n ≥ n₀}.
o(g(n)) = {f(n): for any positive constant c > 0, there exists a
constant n₀ > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n₀}.
Thus o(g(n)) is a strictly stronger requirement than O(g(n)): the
bound must hold for every positive c, not just for some c.
For example: n² = O(n²)
n² ≠ o(n²)
n² = O(n³)
n² = o(n³)
o-notation
 n^1.9999 = o(n²)
 n²/lg n = o(n²)
 n² ≠ o(n²) (just as 2 < 2 is false)
 n²/1000 ≠ o(n²)
ω-notation
 For a given function g(n), we denote by ω(g(n)) the
set of functions
ω(g(n)) = {f(n): for any positive constant c > 0, there
exists a constant n₀ > 0 such that 0 ≤ c·g(n) < f(n) for
all n ≥ n₀}
 f(n) becomes arbitrarily large relative to g(n) as n
approaches infinity: lim (n→∞) [f(n)/g(n)] = ∞
 We say g(n) is a lower bound for f(n) that is not
asymptotically tight.
ω-notation
 n^2.0001 = ω(n²)
 n² lg n = ω(n²)
 n² ≠ ω(n²)
Comparison of Functions
Analogy with comparing two numbers a and b:
f(n) = O(g(n))  ≈  a ≤ b
f(n) = Ω(g(n))  ≈  a ≥ b
f(n) = Θ(g(n))  ≈  a = b
f(n) = o(g(n))  ≈  a < b
f(n) = ω(g(n))  ≈  a > b
Why Does Growth Rate Matter?

Complexity   n = 10         n = 20         n = 30
n            0.00001 sec    0.00002 sec    0.00003 sec
n²           0.0001 sec     0.0004 sec     0.0009 sec
n³           0.001 sec      0.008 sec      0.027 sec
n⁵           0.1 sec        3.2 sec        24.3 sec
2ⁿ           0.001 sec      1.0 sec        17.9 min
3ⁿ           0.059 sec      58 min         6.5 years

Complexity   n = 40         n = 50         n = 60
n            0.00004 sec    0.00005 sec    0.00006 sec
n²           0.0016 sec     0.0025 sec     0.0036 sec
n³           0.064 sec      0.125 sec      0.216 sec
n⁵           1.7 min        5.2 min        13.0 min
2ⁿ           12.7 days      35.7 years     366 cent
3ⁿ           3855 cent      2 × 10⁸ cent   1.3 × 10¹³ cent
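The table entries are consistent with a machine performing 10⁶ basic operations per second; under that assumption (inferred from the table, not stated on the slide), a few entries can be recomputed:

```python
def running_time_seconds(ops, rate=1_000_000):
    """Seconds needed to execute `ops` basic operations at `rate` ops/sec."""
    return ops / rate

# Recompute a few table entries (the 10^6 ops/sec rate is an assumption):
assert running_time_seconds(10) == 0.00001                    # n,   n = 10
assert running_time_seconds(20 ** 3) == 0.008                 # n^3, n = 20
assert abs(running_time_seconds(2 ** 30) / 60 - 17.9) < 0.1   # 2^n, n = 30 -> ~17.9 min
```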
More Related Content

PPTX
Knapsack Problem
PPTX
Chapter 09 design and analysis of algorithms
PDF
Algorithms Lecture 2: Analysis of Algorithms I
PPT
Asymptotic analysis
PPTX
Shift reduce parser
PDF
Operator precedence
PPT
CS8461 - Design and Analysis of Algorithms
PPT
Divide and conquer
Knapsack Problem
Chapter 09 design and analysis of algorithms
Algorithms Lecture 2: Analysis of Algorithms I
Asymptotic analysis
Shift reduce parser
Operator precedence
CS8461 - Design and Analysis of Algorithms
Divide and conquer

What's hot (20)

PPTX
Binary Heap Tree, Data Structure
DOCX
AI Lab Manual.docx
PPTX
Priority Queue in Data Structure
PPTX
Travelling salesman dynamic programming
PPT
Regular expressions-Theory of computation
PPT
Red black tree
PPSX
PDF
Design and analysis of algorithms
PPTX
The n Queen Problem
PDF
P, NP, NP-Complete, and NP-Hard
PDF
Bottom up parser
PPTX
Divide and conquer - Quick sort
PPTX
Mathematical Analysis of Recursive Algorithm.
PPTX
Asymptotic Notation
PPTX
0 1 knapsack using branch and bound
PPTX
Input-Buffering
PPT
Sum of subsets problem by backtracking 
PPTX
Syntax Analysis in Compiler Design
PDF
Algorithmic problem solving
Binary Heap Tree, Data Structure
AI Lab Manual.docx
Priority Queue in Data Structure
Travelling salesman dynamic programming
Regular expressions-Theory of computation
Red black tree
Design and analysis of algorithms
The n Queen Problem
P, NP, NP-Complete, and NP-Hard
Bottom up parser
Divide and conquer - Quick sort
Mathematical Analysis of Recursive Algorithm.
Asymptotic Notation
0 1 knapsack using branch and bound
Input-Buffering
Sum of subsets problem by backtracking 
Syntax Analysis in Compiler Design
Algorithmic problem solving
Ad

Similar to Chapter 2 ds (20)

PDF
Data Structure - Lecture 1 - Introduction.pdf
PPT
01-algo.ppt
PDF
DATA STRUCTURE
PDF
DATA STRUCTURE.pdf
PPTX
Presentation_23953_Content_Document_20240906040454PM.pptx
PPTX
BCSE202Lkkljkljkbbbnbnghghjghghghghghghghgh
PPTX
Chapter two
PDF
Algorithm Analysis.pdf
PPT
Data Structure and Algorithm chapter two, This material is for Data Structure...
PPTX
Module-1.pptxbdjdhcdbejdjhdbchchchchchjcjcjc
PDF
Daa chapter 1
PPT
CS3114_09212011.ppt
PPT
Big Oh.ppt
PPT
Data Structures and Algorithm Analysis
PPTX
Data Structure Algorithm -Algorithm Complexity
PDF
Problem solving using computers - Unit 1 - Study material
PDF
2-Algorithms and Complexit data structurey.pdf
PPTX
DAA-Unit1.pptx
PPTX
Design & Analysis of Algorithm course .pptx
PPT
AA Lecture 01 of my lecture os ghhhggh.ppt
Data Structure - Lecture 1 - Introduction.pdf
01-algo.ppt
DATA STRUCTURE
DATA STRUCTURE.pdf
Presentation_23953_Content_Document_20240906040454PM.pptx
BCSE202Lkkljkljkbbbnbnghghjghghghghghghghgh
Chapter two
Algorithm Analysis.pdf
Data Structure and Algorithm chapter two, This material is for Data Structure...
Module-1.pptxbdjdhcdbejdjhdbchchchchchjcjcjc
Daa chapter 1
CS3114_09212011.ppt
Big Oh.ppt
Data Structures and Algorithm Analysis
Data Structure Algorithm -Algorithm Complexity
Problem solving using computers - Unit 1 - Study material
2-Algorithms and Complexit data structurey.pdf
DAA-Unit1.pptx
Design & Analysis of Algorithm course .pptx
AA Lecture 01 of my lecture os ghhhggh.ppt
Ad

More from Hanif Durad (20)

PPT
Chapter 26 aoa
PPT
Chapter 25 aoa
PPT
Chapter 24 aoa
PPT
Chapter 23 aoa
PPT
Chapter 12 ds
PPT
Chapter 11 ds
PPT
Chapter 10 ds
PPT
Chapter 9 ds
PPT
Chapter 8 ds
PPT
Chapter 7 ds
PPT
Chapter 6 ds
PPT
Chapter 5 ds
PPT
Chapter 4 ds
PPT
Chapter 3 ds
PPT
Chapter 5 pc
PPT
Chapter 4 pc
PPT
Chapter 3 pc
PPT
Chapter 2 pc
PPT
Chapter 1 pc
PPT
Chapter 6 pc
Chapter 26 aoa
Chapter 25 aoa
Chapter 24 aoa
Chapter 23 aoa
Chapter 12 ds
Chapter 11 ds
Chapter 10 ds
Chapter 9 ds
Chapter 8 ds
Chapter 7 ds
Chapter 6 ds
Chapter 5 ds
Chapter 4 ds
Chapter 3 ds
Chapter 5 pc
Chapter 4 pc
Chapter 3 pc
Chapter 2 pc
Chapter 1 pc
Chapter 6 pc

Recently uploaded (20)

PPTX
Cell Structure & Organelles in detailed.
PPTX
Introduction to Child Health Nursing – Unit I | Child Health Nursing I | B.Sc...
PPTX
Pharma ospi slides which help in ospi learning
PDF
Chapter 2 Heredity, Prenatal Development, and Birth.pdf
PDF
O5-L3 Freight Transport Ops (International) V1.pdf
PDF
Basic Mud Logging Guide for educational purpose
PDF
Supply Chain Operations Speaking Notes -ICLT Program
PDF
Classroom Observation Tools for Teachers
PPTX
PPH.pptx obstetrics and gynecology in nursing
PDF
Physiotherapy_for_Respiratory_and_Cardiac_Problems WEBBER.pdf
PDF
TR - Agricultural Crops Production NC III.pdf
PPTX
Final Presentation General Medicine 03-08-2024.pptx
PPTX
Institutional Correction lecture only . . .
PDF
Module 4: Burden of Disease Tutorial Slides S2 2025
PDF
ANTIBIOTICS.pptx.pdf………………… xxxxxxxxxxxxx
PDF
Insiders guide to clinical Medicine.pdf
PDF
BÀI TẬP BỔ TRỢ 4 KỸ NĂNG TIẾNG ANH 9 GLOBAL SUCCESS - CẢ NĂM - BÁM SÁT FORM Đ...
PPTX
Week 4 Term 3 Study Techniques revisited.pptx
PDF
Mark Klimek Lecture Notes_240423 revision books _173037.pdf
PDF
The Lost Whites of Pakistan by Jahanzaib Mughal.pdf
Cell Structure & Organelles in detailed.
Introduction to Child Health Nursing – Unit I | Child Health Nursing I | B.Sc...
Pharma ospi slides which help in ospi learning
Chapter 2 Heredity, Prenatal Development, and Birth.pdf
O5-L3 Freight Transport Ops (International) V1.pdf
Basic Mud Logging Guide for educational purpose
Supply Chain Operations Speaking Notes -ICLT Program
Classroom Observation Tools for Teachers
PPH.pptx obstetrics and gynecology in nursing
Physiotherapy_for_Respiratory_and_Cardiac_Problems WEBBER.pdf
TR - Agricultural Crops Production NC III.pdf
Final Presentation General Medicine 03-08-2024.pptx
Institutional Correction lecture only . . .
Module 4: Burden of Disease Tutorial Slides S2 2025
ANTIBIOTICS.pptx.pdf………………… xxxxxxxxxxxxx
Insiders guide to clinical Medicine.pdf
BÀI TẬP BỔ TRỢ 4 KỸ NĂNG TIẾNG ANH 9 GLOBAL SUCCESS - CẢ NĂM - BÁM SÁT FORM Đ...
Week 4 Term 3 Study Techniques revisited.pptx
Mark Klimek Lecture Notes_240423 revision books _173037.pdf
The Lost Whites of Pakistan by Jahanzaib Mughal.pdf

Chapter 2 ds

  • 1. Chapter 2 Introduction to Algorithms Dr. Muhammad Hanif Durad Department of Computer and Information Sciences Pakistan Institute Engineering and Applied Sciences hanif@pieas.edu.pk Some slides have bee adapted with thanks from some other lectures available on Internet. It made my life easier, as life is always miserable at PIEAS (Sir Muhammad Yusaf Kakakhil )
  • 2. Dr. Hanif Durad 2 Lecture Outline  Algorithm  Analysis of Algorithms  Computational Model  Random Access Machine (RAM)  Average, Worst, and Best Cases  Higher order functions of n are normally considered less efficient  Asymptotic Notation  Q, O, W, o, w  Why Does Growth Rate Matter?
  • 3. Algorithm (1/2)  Informally,  A tool for solving a well-specified computational problem.  Example: sorting input: A sequence of numbers. output: An ordered permutation of the input. AlgorithmInput Output D:DSALCOMP 550-00101-algo.ppt Dr. Hanif Durad
  • 4. Algorithm (2/2)  What is Algorithm?  a clearly specified set of simple instructions to be followed to solve a problem  Takes a set of values, as input and  produces a value, or set of values, as output  Usually specified as a pseudo-code  Data structures  Methods of organizing data  Program = algorithms + data structures 4 D:DSALCOMP171 Data Structures and Algorithmintro_algo.ppt
  • 5. 5 Analysis of Algorithms (1/2)  Correctness:  Does the algorithm do what is intended.  Efficiecency:  What is the running time of the algorithm.  How much storage does it consume.  Different algorithms may be correct  Which should I use?  Analysis of algorithms is to use mathematical techniques to predict the efficiency of algorithms. Dr. Hanif Durad D:Data StructuresHanif_Searchch1intro.ppt+ D:DSALCD3570lecture1_introduction.pdf CD3570
  • 6. 6 Analysis of Algorithms (2/2)  What do we mean by efficiency?  Efficiency is usually given with respect to some cost measure  Cost measures are defined in terms of resource usage:  Execution time  Memory usage  Communication bandwidth  Computer hardware  Energy consumption  etc.  We will mainly look at cost in terms of execution time Dr. Hanif Durad D:Data StructuresHanif_Searchch1intro.ppt+ D:DSALCD3570lecture1_introduction.pdf
  • 7. Running-time of algorithms  Bounds are for the algorithms, rather than programs  programs are just implementations of an algorithm, and almost always the details of the program do not affect the bounds  Bounds are for algorithms, rather than problems  A problem can be solved with several algorithms, some are more efficient than others D:Data StructuresCOMP171 Data Structures and Algorithmalgo.ppt
  • 8. But, how to measure the time?  Multiplication and addition: which one takes longer?  How do we measure >=, assignment, &&, ||, etc etc Machine dependent?
  • 9. What is the efficiency of an algorithm? Run time in the computer: Machine Dependent Example: Need to multiply two positive integers a and b Subroutine 1: Multiply a and b Subroutine 2: V = a, W = b While W > 1 V V + a; W W-1 Output V
  • 10. Solution: Machine Independent Analysis We assume that every basic operation takes constant time: Example Basic Operations: Addition, Subtraction, Multiplication, Memory Access Non-basic Operations: Sorting, Searching Efficiency of an algorithm is the number of basic operations it performs We do not distinguish between the basic operations.
  • 11. Subroutine 1 uses ? basic operation Subroutine 2 uses ? basic operations Subroutine ? is more efficient. This measure is good for all large input sizes In fact, we will not worry about the exact values, but will look at ``broad classes’ of values, or the growth rates Let there be n inputs. If an algorithm needs n basic operations and another needs 2n basic operations, we will consider them to be in the same efficiency category. However, we distinguish between exp(n), n, log(n)
  • 12. Computational Model  Should be simple, or even simplistic.  Assign uniform cost for all simple operations and memory accesses. (Not true in practice.)  Question: Is this OK?  Should be widely applicable.  Can’t assume the model to support complex operations. Ex: No SORT instruction.  Size of a word of data is finite.  Why? Dr. Hanif Durad 12 D:DSALCOMP 550-00101-algo.ppt
  • 13. Random Access Machine (RAM)  Generic single-processor model.  Supports simple constant-time instructions found in real computers.  Arithmetic (+, –, *, /, %, floor, ceiling).  Data Movement (load, store, copy).  Control (branch, subroutine call).  Run time (cost) is uniform (1 time unit) for all simple instructions.  Memory is unlimited.  Flat memory model – no hierarchy.  Access to a word of memory takes 1 time unit.  Sequential execution – no concurrent operations. 13 D:DSALCOMP 550-00101-algo.ppt
  • 14. 14 Complexity  Complexity is the number of steps required to solve a problem.  The goal is to find the best algorithm to solve the problem with a less number of steps  Complexity of Algorithms  The size of the problem is a measure of the quantity of the input data n  The time needed by an algorithm, expressed as a function of the size of the problem (it solves), is called the (time) complexity of the algorithm T(n) D:DSALAlgorithms and computational complexity 03_Growth_of_Functions_1.ppt, P-3 Dr. Hanif Durad
  • 15. 15 Basic idea: counting operations  Running Time: Number of primitive steps that are executed  most statements roughly require the same amount of time  y = m * x + b  c = 5 / 9 * (t - 32 )  z = f(x) + g(y)  Each algorithm performs a sequence of basic operations:  Arithmetic: (low + high)/2  Comparison: if ( x > 0 ) …  Assignment: temp = x  Branching: while ( true ) { … }  … Dr. Hanif Durad
  • 16. 16 Basic idea: counting operations  Idea: count the number of basic operations performed on the input.  Difficulties:  Which operations are basic?  Not all operations take the same amount of time.  Operations take different times with different hardware or compilers Dr. Hanif Durad
  • 17. 17 Measures of Algorithm Complexity  Let T(n) denote the number of operations required by an algorithm to solve a given class of problems  Often T(n) depends on the input, in such cases one can talk about  Worst-case complexity,  Best-case complexity,  Average-case complexity of an algorithm  Alternatively, one can determine bounds (upper or lower) on T(n) Dr. Hanif Durad
  • 18. 18 Measures of Algorithm Complexity  Worst-Case Running Time: the longest time for any input size of n  provides an upper bound on running time for any input  Best-Case Running Time: the shortest time for any input size of n  provides lower bound on running time for any input  Average-Case Behavior: the expected performance averaged over all possible inputs  it is generally better than worst case behavior, but sometimes it’s roughly as bad as worst case  difficult to compute Dr. Hanif Durad
  • 19. Average, Worst, and Best Cases  An algorithm may run faster on certain data sets than others.  Finding the average case can be very difficult, so typically algorithms are measured in the worst case time complexity.  Also, in certain application domains (e.g., air traffic control, medical, etc.) knowing the worst case time complexity is of crucial importance. Dr. Hanif Durad 19 D:Data StructuresICS202Lecture05.ppt
  • 20. Worst vs. Average Case Dr. Hanif Durad 20 D:Data StructuresICS202Lecture05.ppt
  • 21. 21 Example 1: Sum Series Algorithm Step Count 1 2 3 4 1 2n+2 4n 1 Total 6n + 4 3 1 N i i    Lines 1 and 4 count for one unit each  Line 3: executed N times, each time four units  Line 2: (1 for initialization, N+1 for all the tests, N for all the increments) total 2N + 2  total cost: 6N + 4  O(N) D:Data StructuresCOMP171 Data Structures and Algorithmalgo.ppt
  • 22. 22 Example 2: Sequential Search Algorithm Step Count // Searches for x in array A of n items // returns index of found item, or n+1 if not found Seq_Search( A[n]: array, x: item){ done = false i = 1 while ((i <= n) and (A[i] <> x)){ i = i +1 } return i } 0 1 1 n + 1 n 0 1 0 Total 2n + 4
  • 23. 23 Example: Sequential Search  worst-case running time  when x is not in the original array A  in this case, while loop needs 2(n + 1) comparisons + c other operations  So, T(n) = 2n + 2 + c  Linear complexity  best-case running time  when x is found in A[1]  in this case, while loop needs 2 comparisons + c other operations  So, T(n) = 2 + c  Constant complexity Dr. Hanif Durad
  • 24. 24 Order of Growth  For very large input size, it is the rate of grow, or order of growth that matters asymptotically  We can ignore the lower-order terms, since they are relatively insignificant for very large n  We can also ignore leading term’s constant coefficients, since they are not as important for the rate of growth in computational efficiency for very large n  Higher order functions of n are normally considered less efficient Dr. Hanif Durad
  • 25. 25 Asymptotic Notation  Q, O, W, o, w  Used to describe the running times of algorithms  Instead of exact running time, say Q(n2)  Defined for functions whose domain is the set of natural numbers, N  Determine sets of functions, in practice used to compare two functions Dr. Hanif Durad
  • 26. 26 Asymptotic Notation  By now you should have an intuitive feel for asymptotic (big-O) notation:  What does O(n) running time mean? O(n2)? O(n lg n)?  Our first task is to define this notation more formally and completely Dr. Hanif Durad
  • 27. 27 Big-O notation (Upper Bound – Worst Case)  For a given function g(n), we denote by O(g(n)) the set of functions  O(g(n)) = {f(n): there exist positive constants c >0 and n0 >0 such that 0  f(n)  cg(n) for all n  n0 }  We say g(n) is an asymptotic upper bound for f(n):  O(g(n)) means that as n  , the execution time f(n) is at most c.g(n) for some constant c  What does O(g(n)) running time mean?  The worst-case running time (upper-bound) is a function of g(n) to a within a constant factor   )( )( lim0 ng nf n Dr. Hanif Durad
  • 28. 28 Big-O notation (Upper Bound – Worst Case) [Figure: running time vs n; beyond n0, the curve f(n) lies below c·g(n), illustrating f(n) = O(g(n))]
  • 29. 29 O-notation For a given function g(n), we denote by O(g(n)) the set of functions O(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0} We say g(n) is an asymptotic upper bound for f(n)
  • 30. 30 Big-O notation (Upper Bound – Worst Case)  This is a mathematically formal way of ignoring constant factors and looking only at the “shape” of the function  f(n) = O(g(n)) should be read as saying “f(n) is at most g(n), up to constant factors”  We usually take f(n) to be the running time of an algorithm and g(n) a nicely written function  e.g. the running time of the insertion sort algorithm is O(n²)  Example: 2n² = O(n³), with c = 1 and n0 = 2
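The witness constants in the example can be spot-checked numerically. The helper below is a sketch (holds_big_o is my own name), and a finite scan is of course evidence, not a proof:

```python
def holds_big_o(f, g, c, n0, n_max=1000):
    """Check 0 <= f(n) <= c*g(n) for every integer n0 <= n <= n_max."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 2n^2 = O(n^3) with c = 1 and n0 = 2, as on the slide
assert holds_big_o(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=2)

# The definition genuinely needs n0: at n = 1, 2*1^2 = 2 > 1^3 = 1
assert not holds_big_o(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=1)
```

Any larger c or n0 would also serve as a witness; the definition only asks that some pair exists.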
  • 31. 31 Examples of functions in O(n²)  n²  n² + n  n² + 1000n  1000n² + 1000n Also,  n  n/1000  n^1.99999  n²/lg lg lg n
  • 32. 32 Big-O notation (Upper Bound – Worst Case)  Example 1: Is 2n + 7 = O(n)?  Let T(n) = 2n + 7 = n(2 + 7/n)  Note that for n = 7: 2 + 7/n = 2 + 7/7 = 3  So T(n) ≤ 3n for all n ≥ 7 (c = 3, n0 = 7)  Then T(n) = O(n)  Alternatively, lim_{n→∞} T(n)/n = 2 < ∞, so T(n) = O(n)
  • 33. 33 Big-O notation (Upper Bound – Worst Case)  Example 2: Is 5n³ + 2n² + n + 10⁶ = O(n³)?  Let T(n) = 5n³ + 2n² + n + 10⁶ = n³(5 + 2/n + 1/n² + 10⁶/n³)  Note that for n = 100: 5 + 2/100 + 1/10000 + 1 ≈ 6.02  So T(n) ≤ 6.05·n³ for all n ≥ 100 (c = 6.05, n0 = 100)  Then T(n) = O(n³)  Alternatively, lim_{n→∞} T(n)/n³ = 5 < ∞, so T(n) = O(n³)
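The limit test in this example can be illustrated numerically: the ratio T(n)/n³ falls toward the leading coefficient 5 as n grows (a sketch, not a proof).

```python
def T(n):
    return 5 * n**3 + 2 * n**2 + n + 10**6

# Ratio T(n)/n^3 for n = 10, 100, 1000, 10000
ratios = [T(10**k) / (10**k) ** 3 for k in range(1, 5)]
assert ratios == sorted(ratios, reverse=True)   # strictly decreasing toward 5
assert abs(ratios[-1] - 5) < 1e-3               # already close to 5 at n = 10^4
```

At n = 10 the constant 10⁶ still dominates (the ratio is over 1000); by n = 10⁴ the lower-order terms contribute almost nothing, which is exactly why they can be ignored asymptotically.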
  • 34. 34 Big-O notation (Upper Bound – Worst Case)  Express the execution time as a function of the input size n  Since only the growth rate matters, we can ignore the multiplicative constants and the lower-order terms, e.g.,  n, n + 1, n + 80, 40n, n + log n are all O(n)  n^1.1 + 10000000000n is O(n^1.1)  n² is O(n²)  3n² + 6n + log n + 24.5 is O(n²)  O(1) < O(log n) < O((log n)³) < O(n) < O(n²) < O(n³) < O(n^log n) < O(2^√n) < O(2^n) < O(n!) < O(n^n)  Constant < Logarithmic < Linear < Quadratic < Cubic < Polynomial < Exponential < Factorial
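The ordering above can be sanity-checked by comparing log₂ of each function at one large n (here n = 2²⁰; log₂(n!) is obtained via Stirling through math.lgamma). This checks the ordering at a single point only; the asymptotic claim is about all sufficiently large n.

```python
import math

n = 2**20
log2_of = [
    ("log n",     math.log2(math.log2(n))),
    ("n",         math.log2(n)),
    ("n^2",       2 * math.log2(n)),
    ("n^3",       3 * math.log2(n)),
    ("n^log n",   math.log2(n) ** 2),
    ("2^sqrt(n)", math.sqrt(n)),
    ("2^n",       float(n)),
    ("n!",        math.lgamma(n + 1) / math.log(2)),  # log2(n!) by Stirling
]
values = [v for _, v in log2_of]
assert values == sorted(values)   # matches the slide's ordering at n = 2^20
```

Note that n must be fairly large before some of these separations appear: at small n, 2^√n is still below n³, and n^log n is above 2^√n until roughly n = 2¹⁶.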
  • 35. 35 Ω-notation (Omega) (Lower Bound – Best Case)  For a given function g(n), we denote by Ω(g(n)) the set of functions  Ω(g(n)) = {f(n): there exist positive constants c > 0 and n0 > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0}  We say g(n) is an asymptotic lower bound for f(n):  Ω(g(n)) means that as n → ∞, the execution time f(n) is at least c·g(n) for some constant c  What does Ω(g(n)) running time mean?  The best-case running time (lower bound) is within a constant factor of g(n)  Equivalently, lim_{n→∞} f(n)/g(n) > 0
  • 36. 36 Ω-notation (Lower Bound – Best Case) [Figure: running time vs n; beyond n0, the curve f(n) lies above c·g(n), illustrating f(n) = Ω(g(n))]
  • 37. 37 Ω-notation For a given function g(n), we denote by Ω(g(n)) the set of functions Ω(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0} We say g(n) is an asymptotic lower bound for f(n)
  • 38. 38 Ω-notation (Omega) (Lower Bound – Best Case)  We say Insertion Sort’s running time T(n) is Ω(n)  For example  the worst-case running time of insertion sort is O(n²), and  the best-case running time of insertion sort is Ω(n)  Its running time therefore falls between a linear function of n and a quadratic function of n  Example: √n = Ω(lg n), with c = 1 and n0 = 16
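The Ω example on this slide can be spot-checked over a finite range (evidence, not a proof):

```python
import math

# sqrt(n) >= 1 * lg n for all n >= 16, i.e. sqrt(n) = Omega(lg n) with c = 1, n0 = 16
assert all(math.sqrt(n) >= math.log2(n) for n in range(16, 10001))

# Equality holds exactly at the threshold: sqrt(16) = lg 16 = 4
assert math.sqrt(16) == math.log2(16) == 4.0
```

n0 = 16 is the natural choice here because √n and lg n cross at exactly that point; below it (e.g. at n = 8, √8 ≈ 2.83 < 3 = lg 8) the inequality fails.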
  • 39. 39 Examples of functions in Ω(n²)  n²  n² + n  n² − n  1000n² + 1000n  1000n² − 1000n Also,  n³  n^2.00001  n² lg lg lg n
  • 40. 40 Θ-notation (Theta) (Tight Bound)  In some cases,  f(n) = O(g(n)) and f(n) = Ω(g(n))  This means that the worst and best cases require the same amount of time, to within a constant factor  In this case we use a new notation, “theta Θ”  For a given function g(n), we denote by Θ(g(n)) the set of functions  Θ(g(n)) = {f(n): there exist positive constants c1 > 0, c2 > 0 and n0 > 0 such that  c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0}
  • 41. 41 Θ-notation (Theta) (Tight Bound)  We say g(n) is an asymptotic tight bound for f(n):  Θ(g(n)) means that as n → ∞, the execution time f(n) is at most c2·g(n) and at least c1·g(n) for some constants c1 and c2  f(n) = Θ(g(n)) if and only if  f(n) = O(g(n)) and f(n) = Ω(g(n))  Equivalently, 0 < lim_{n→∞} f(n)/g(n) < ∞
  • 42. 42 Θ-notation (Theta) (Tight Bound) [Figure: running time vs n; beyond n0, the curve f(n) lies between c1·g(n) and c2·g(n), illustrating f(n) = Θ(g(n))]
  • 43. 43 Θ-notation (Theta) (Tight Bound)  Example: n²/2 − 2n = Θ(n²), with c1 = 1/4, c2 = 1/2, and n0 = 8
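Again, a finite spot-check of the stated constants (a sketch, not a proof):

```python
def f(n):
    return n * n / 2 - 2 * n

# c1*n^2 <= f(n) <= c2*n^2 for all n >= 8, with c1 = 1/4 and c2 = 1/2
assert all(0.25 * n * n <= f(n) <= 0.5 * n * n for n in range(8, 10001))

# The lower bound fails just below n0 = 8: f(7) = 10.5 < 0.25 * 49 = 12.25
assert f(7) < 0.25 * 49
```

At n = 8 the lower bound is tight (f(8) = 16 = 8²/4), which is why n0 cannot be taken any smaller for this choice of c1.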
  • 44. 44 o-notation  For a given function g(n), we denote by o(g(n)) the set of functions: o(g(n)) = {f(n): for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0}  f(n) becomes insignificant relative to g(n) as n approaches infinity: lim_{n→∞} f(n)/g(n) = 0  We say g(n) is an upper bound for f(n) that is not asymptotically tight
  • 45. 45 O(*) versus o(*) O(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0}. o(g(n)) = {f(n): for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0}. Thus f(n) = o(g(n)) is a strictly stronger statement than f(n) = O(g(n)): the bound must hold for every c, not just for some c. For example: n² = O(n²) but n² ≠ o(n²); n² = O(n³) and n² = o(n³)
  • 46. 46 o-notation  n^1.9999 = o(n²)  n²/lg n = o(n²)  n² ≠ o(n²) (just as 2 < 2 is false)  n²/1000 ≠ o(n²)
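The contrast between the second and fourth examples shows up in the ratios f(n)/n² (a numeric illustration only):

```python
import math

# n^2 / lg n IS o(n^2): the ratio 1/lg n tends to 0
ratios = [(n * n / math.log2(n)) / (n * n) for n in (10, 10**3, 10**6)]
assert ratios[0] > ratios[1] > ratios[2]   # shrinking toward 0

# n^2 / 1000 is NOT o(n^2): the ratio stays at the constant 1/1000
ratios = [(n * n / 1000) / (n * n) for n in (10, 10**3, 10**6)]
assert all(abs(r - 0.001) < 1e-12 for r in ratios)
```

A constant-factor ratio like 1/1000, however small, never passes below every c > 0, so the little-o definition rejects it.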
  • 47. 47 ω-notation  For a given function g(n), we denote by ω(g(n)) the set of functions ω(g(n)) = {f(n): for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0}  f(n) becomes arbitrarily large relative to g(n) as n approaches infinity: lim_{n→∞} f(n)/g(n) = ∞  We say g(n) is a lower bound for f(n) that is not asymptotically tight
  • 48. 48 ω-notation  n^2.0001 = ω(n²)  n² lg n = ω(n²)  n² ≠ ω(n²)
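Dually to little-o, for ω the ratio f(n)/g(n) grows without bound (a numeric illustration):

```python
import math

# (n^2 * lg n) / n^2 = lg n, which grows without bound, so n^2 lg n = omega(n^2)
ratios = [(n * n * math.log2(n)) / (n * n) for n in (2, 2**10, 2**20)]
assert ratios == [1.0, 10.0, 20.0]   # the ratio is exactly lg n, increasing
```

The values are exact here because every operand is a power of two, so the floating-point multiply and divide introduce no rounding.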
  • 49. 49 Comparison of Functions: analogy between asymptotic relations on functions (f, g) and order relations on numbers (a, b): f(n) = O(g(n)) ≈ a ≤ b; f(n) = Ω(g(n)) ≈ a ≥ b; f(n) = Θ(g(n)) ≈ a = b; f(n) = o(g(n)) ≈ a < b; f(n) = ω(g(n)) ≈ a > b
  • 50. Why Does Growth Rate Matter? (times assume 10⁶ operations per second)
    Complexity | n = 10      | n = 20      | n = 30
    n          | 0.00001 sec | 0.00002 sec | 0.00003 sec
    n²         | 0.0001 sec  | 0.0004 sec  | 0.0009 sec
    n³         | 0.001 sec   | 0.008 sec   | 0.027 sec
    n⁵         | 0.1 sec     | 3.2 sec     | 24.3 sec
    2ⁿ         | 0.001 sec   | 1.0 sec     | 17.9 min
    3ⁿ         | 0.059 sec   | 58 min      | 6.5 years
  • 51. Why Does Growth Rate Matter?
    Complexity | n = 40      | n = 50        | n = 60
    n          | 0.00004 sec | 0.00005 sec   | 0.00006 sec
    n²         | 0.0016 sec  | 0.0025 sec    | 0.0036 sec
    n³         | 0.064 sec   | 0.125 sec     | 0.216 sec
    n⁵         | 1.7 min     | 5.2 min       | 13.0 min
    2ⁿ         | 12.7 days   | 35.7 years    | 366 cent
    3ⁿ         | 3855 cent   | 2 × 10⁸ cent  | 1.3 × 10¹³ cent
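The entries in these tables can be reproduced from the operation counts. The rate of 10⁶ operations per second is my inference from the n row (n = 10 → 0.00001 sec):

```python
OPS_PER_SEC = 10**6   # inferred machine speed behind the tables

def seconds(ops):
    """Wall-clock time for a given number of primitive operations."""
    return ops / OPS_PER_SEC

assert abs(seconds(10) - 0.00001) < 1e-12          # n,   n = 10
assert abs(seconds(30**3) - 0.027) < 1e-12         # n^3, n = 30
assert 17 < seconds(2**30) / 60 < 18               # 2^n, n = 30: ~17.9 min
assert 58 < seconds(3**20) / 60 < 59               # 3^n, n = 20: ~58 min
assert 6 < seconds(3**30) / (365 * 24 * 3600) < 7  # 3^n, n = 30: ~6.5 years
```

The point of the exercise: a thousand-fold faster machine rescues the polynomial rows but barely moves the exponential ones, since 2ⁿ gains only about 10 more units of n.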