Greedy Algorithms
Chapter 3
Overview
• Greedy algorithms are used to solve optimization
problems.
– For most optimization problems you want to find, not
just a solution, but the best solution.
• A greedy algorithm sometimes works well for
optimization problems. It works in phases.
– At each phase:
• You take the best you can get right now, without
regard for future consequences.
• You hope that by choosing a local optimum at each
step, you will end up at a global optimum.
…
• Problems exhibit optimal substructure.
– An optimal solution to the problem contains optimal
solutions to its subproblems.
• Problems also exhibit the greedy-choice property.
– When we have a choice to make, make the one that looks
best right now.
– Make a locally optimal choice in the hope of reaching a
globally optimal solution.
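The greedy-choice idea can be made concrete with a classic example (hypothetical here, not from the slides): making change with the fewest coins. A minimal Python sketch, assuming a canonical coin system such as 25/10/5/1, where the locally optimal choice happens to be globally safe:

```python
# Greedy coin change: at each step take the largest coin that still fits.
# This works for "canonical" coin systems like (25, 10, 5, 1), but can fail
# for arbitrary ones -- a reminder that greedy needs the greedy-choice property.
def greedy_change(amount, coins=(25, 10, 5, 1)):
    result = []
    for c in coins:            # coins assumed sorted in decreasing order
        while amount >= c:     # locally optimal: grab the biggest coin
            result.append(c)
            amount -= c
    return result

print(greedy_change(63))  # -> [25, 25, 10, 1, 1, 1]
```

For a non-canonical system such as (4, 3, 1) and amount 6, this greedy picks 4+1+1 (three coins) while 3+3 (two coins) is optimal, so the greedy-choice property fails there.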
Greedy Strategy
• The choice that seems best at the moment is the one
we go with.
– Prove that when there is a choice to make, one of the
optimal choices is the greedy choice.
• Therefore, it’s always safe to make the greedy choice.
– Show that all but one of the subproblems resulting from
the greedy choice are empty.
Minimum Spanning Trees (MST)
Minimum Spanning Trees
• Spanning Tree
– A tree (i.e., connected, acyclic graph) which contains
all the vertices of the graph
• Minimum Spanning Tree
– Spanning tree with the minimum sum of weights
• Spanning forest
– If a graph is not connected, then there is a spanning
tree for each connected component of the graph
[Figure: example weighted undirected graph on vertices a, b, c, d, e, f, g, h, i, used throughout the MST examples]
Applications of MST
– Find the least expensive way to connect a set of
cities, terminals, computers, etc.
How to build an MST?
Idea: build the MST edge by edge.
Start from A = ∅. By definition, A is a (trivial) subset of some MST.
Add edges to A, maintaining the invariant that A is a subset of some MST.
Stop when no edge can be added to A anymore. At this point, A is an MST.
Example
Problem
• A town has a set of houses
and a set of roads
• A road connects 2 and only
2 houses
• A road connecting houses u and v has a repair
cost w(u, v)
Goal: Repair enough (and no more) roads such that:
1. Everyone stays connected
i.e., can reach every house from all other houses
2. Total repair cost is minimum
[Figure: the example graph; vertices are houses a–i, edge weights are road repair costs]
Minimum Spanning Trees
• A connected, undirected graph:
– Vertices = houses, Edges = roads
• A weight w(u, v) on each edge (u, v) ∈ E

[Figure: the example weighted graph on vertices a–i]

Find T ⊆ E such that:
1. T connects all vertices
2. w(T) = Σ(u,v)∈T w(u, v) is
minimized
Properties of Minimum Spanning Trees
• A minimum spanning tree is not necessarily unique
• An MST has no cycles
– We can remove an edge of a cycle and still have
the vertices connected, while reducing the cost
• Number of edges in an MST:
– |V| - 1
Prim’s Algorithm
• The edges in set A always form a single tree
• Starts from an arbitrary root: VA = {a}
• At each step:
– Find a light edge crossing (VA, V - VA)
– Add this edge to A
– Repeat until the tree spans all vertices
[Figure: the example graph; Prim's algorithm starts from root a]
How to Find Light Edges Quickly?
Use a priority queue Q:
• Contains vertices not yet
included in the tree, i.e., (V – VA)
– VA = {a}, Q = {b, c, d, e, f, g, h, i}
• We associate a key with each vertex v:
key[v] = minimum weight of any edge (u, v)
connecting v to VA
[Figure: vertex a connected to the tree by two edges of weights w1 and w2; key[a] = min(w1, w2)]
…
• After adding a new node to VA, we update the keys of
all the nodes adjacent to it
e.g., after adding a to the tree, key[b] = 4 and key[h] = 8
• The key of v is ∞ if v is not adjacent to any vertex in VA

[Figure: the example graph showing the key updates after adding a]
Example
• Initialization: key[a] = 0, key = ∞ for all other vertices
Q = {a, b, c, d, e, f, g, h, i}, VA = ∅
Extract-MIN(Q) → a
• key[b] = 4, π[b] = a
key[h] = 8, π[h] = a
Keys now: b = 4, h = 8, all others = ∞
Q = {b, c, d, e, f, g, h, i}, VA = {a}
Extract-MIN(Q) → b
Example
• key[c] = 8, π[c] = b
key[h] = 8, π[h] = a (unchanged)
Keys now: c = 8, h = 8, all others = ∞
Q = {c, d, e, f, g, h, i}, VA = {a, b}
Extract-MIN(Q) → c
• key[d] = 7, π[d] = c
key[f] = 4, π[f] = c
key[i] = 2, π[i] = c
Keys now: d = 7, e = ∞, f = 4, g = ∞, h = 8, i = 2
Q = {d, e, f, g, h, i}, VA = {a, b, c}
Extract-MIN(Q) → i
Example
• key[h] = 7, π[h] = i
key[g] = 6, π[g] = i
Keys now: d = 7, e = ∞, f = 4, g = 6, h = 7
Q = {d, e, f, g, h}, VA = {a, b, c, i}
Extract-MIN(Q) → f
• key[g] = 2, π[g] = f
key[d] = 7, π[d] = c (unchanged)
key[e] = 10, π[e] = f
Keys now: d = 7, e = 10, g = 2, h = 7
Q = {d, e, g, h}, VA = {a, b, c, i, f}
Extract-MIN(Q) → g
Example
• key[h] = 1, π[h] = g
Keys now: d = 7, e = 10, h = 1
Q = {d, e, h}, VA = {a, b, c, i, f, g}
Extract-MIN(Q) → h
• Keys now: d = 7, e = 10
Q = {d, e}, VA = {a, b, c, i, f, g, h}
Extract-MIN(Q) → d
Example
• key[e] = 9, π[e] = d (edge (d, e) of weight 9 improves on key[e] = 10)
Q = {e}, VA = {a, b, c, i, f, g, h, d}
Extract-MIN(Q) → e
• Q = ∅
VA = {a, b, c, i, f, g, h, d, e}
PRIM(V, E, w, r)
1. Q ← ∅
2. for each u ∈ V
3.     do key[u] ← ∞
4.        π[u] ← NIL
5.        INSERT(Q, u)
6. DECREASE-KEY(Q, r, 0)    ► key[r] ← 0
7. while Q ≠ ∅
8.     do u ← EXTRACT-MIN(Q)
9.        for each v ∈ Adj[u]
10.           do if v ∈ Q and w(u, v) < key[v]
11.              then π[v] ← u
12.                   DECREASE-KEY(Q, v, w(u, v))

Analysis (Q implemented as a min-heap):
• Lines 1–5: O(V)
• EXTRACT-MIN is executed |V| times and takes O(logV) ⇒ O(VlogV)
• The inner loop (lines 9–12) is executed O(E) times in total;
the test is constant time and DECREASE-KEY takes O(logV) ⇒ O(ElogV)
Total time: O(VlogV + ElogV) = O(ElogV)
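The pseudocode above can be sketched in Python. Since the standard heapq module has no DECREASE-KEY operation, this version uses the common lazy-deletion workaround (push duplicate entries, skip stale ones on pop); it keeps the O(E log V) bound but is not a line-by-line transcription of PRIM. The graph and its MST weight of 37 come from the lecture's running example.

```python
import heapq

# Prim's algorithm with a lazy-deletion min-heap instead of DECREASE-KEY.
def prim_mst(graph, root):
    """graph: {u: [(v, w), ...]} undirected adjacency list."""
    in_tree = {root}
    heap = [(w, root, v) for v, w in graph[root]]
    heapq.heapify(heap)
    total, mst_edges = 0, []
    while heap and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(heap)
        if v in in_tree:        # stale entry: v already reached by a lighter edge
            continue
        in_tree.add(v)
        total += w
        mst_edges.append((u, v))
        for x, wx in graph[v]:
            if x not in in_tree:
                heapq.heappush(heap, (wx, v, x))
    return total, mst_edges

# The lecture's 9-vertex example graph:
edges = [('a','b',4), ('a','h',8), ('b','h',11), ('b','c',8), ('c','i',2),
         ('c','f',4), ('c','d',7), ('d','f',14), ('d','e',9), ('e','f',10),
         ('f','g',2), ('g','i',6), ('g','h',1), ('h','i',7)]
graph = {}
for u, v, w in edges:
    graph.setdefault(u, []).append((v, w))
    graph.setdefault(v, []).append((u, w))
print(prim_mst(graph, 'a')[0])  # total MST weight: 37
```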
Kruskal’s Algorithm
• How is it different from Prim’s algorithm?
– Prim’s algorithm grows one tree all the
time
– Kruskal’s algorithm grows multiple trees
(i.e., a forest) at the same time.
– Trees are merged together using safe edges
– Since an MST has exactly |V| - 1 edges,
after |V| - 1 merges, we would have only
one component
[Figure: a safe edge (u, v) merging tree1 and tree2]
We would add edge (c, f)

[Figure: the example graph with two partial trees; the light edge (c, f) connects them]
…
• Start with each vertex being its own
component
• Repeatedly merge two components into
one by choosing the light edge that
connects them
• Which components to consider at each
iteration?
– Scan the set of edges in monotonically
increasing order by weight
Example
Edges sorted by weight:
1: (h, g)
2: (c, i), (g, f)
4: (a, b), (c, f)
6: (i, g)
7: (c, d), (i, h)
8: (a, h), (b, c)
9: (d, e)
10: (e, f)
11: (b, h)
14: (d, f)

[Figure: the example graph]

Initially: {a}, {b}, {c}, {d}, {e}, {f}, {g}, {h}, {i}
1. Add (h, g)     → {g, h}, {a}, {b}, {c}, {d}, {e}, {f}, {i}
2. Add (c, i)     → {g, h}, {c, i}, {a}, {b}, {d}, {e}, {f}
3. Add (g, f)     → {g, h, f}, {c, i}, {a}, {b}, {d}, {e}
4. Add (a, b)     → {g, h, f}, {c, i}, {a, b}, {d}, {e}
5. Add (c, f)     → {g, h, f, c, i}, {a, b}, {d}, {e}
6. Ignore (i, g)  → {g, h, f, c, i}, {a, b}, {d}, {e}
7. Add (c, d)     → {g, h, f, c, i, d}, {a, b}, {e}
8. Ignore (i, h)  → {g, h, f, c, i, d}, {a, b}, {e}
9. Add (a, h)     → {g, h, f, c, i, d, a, b}, {e}
10. Ignore (b, c) → {g, h, f, c, i, d, a, b}, {e}
11. Add (d, e)    → {g, h, f, c, i, d, a, b, e}
12. Ignore (e, f) → {g, h, f, c, i, d, a, b, e}
13. Ignore (b, h) → {g, h, f, c, i, d, a, b, e}
14. Ignore (d, f) → {g, h, f, c, i, d, a, b, e}
KRUSKAL(V, E, w)
1. A ← ∅
2. for each vertex v ∈ V
3.     do MAKE-SET(v)
4. sort E into non-decreasing order by weight w
5. for each (u, v) taken from the sorted list
6.     do if FIND-SET(u) ≠ FIND-SET(v)
7.        then A ← A ∪ {(u, v)}
8.             UNION(u, v)
9. return A

Analysis:
• Lines 1–3: O(V)
• Sorting (line 4): O(ElgE)
• The loop of lines 5–8: O(E) iterations, with FIND-SET/UNION
taking O(lgV) each ⇒ O(ElgV)
- Running time: O(V + ElgE + ElgV) = O(ElgE)
- Since E = O(V²), we have lgE = O(2lgV) = O(lgV)
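A Python sketch of KRUSKAL. The slides do not specify how the disjoint-set operations are implemented, so this version assumes a union-find with path compression and union by rank; the name kruskal_mst is illustrative, not from the source. It is checked against the lecture's example graph (MST weight 37, 8 edges).

```python
# Kruskal's algorithm with a simple union-find (MAKE-SET / FIND-SET / UNION).
def kruskal_mst(vertices, edges):
    parent = {v: v for v in vertices}   # MAKE-SET for every vertex
    rank = {v: 0 for v in vertices}

    def find(v):                        # FIND-SET with path compression
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    mst, total = [], 0
    for u, v, w in sorted(edges, key=lambda e: e[2]):  # non-decreasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                    # different components: (u, v) is safe
            if rank[ru] < rank[rv]:
                ru, rv = rv, ru
            parent[rv] = ru             # UNION by rank
            rank[ru] += rank[ru] == rank[rv]
            mst.append((u, v))
            total += w
    return total, mst

edges = [('a','b',4), ('a','h',8), ('b','h',11), ('b','c',8), ('c','i',2),
         ('c','f',4), ('c','d',7), ('d','f',14), ('d','e',9), ('e','f',10),
         ('f','g',2), ('g','i',6), ('g','h',1), ('h','i',7)]
total, mst = kruskal_mst('abcdefghi', edges)
print(total, len(mst))  # 37 8
```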
Shortest Paths
Shortest Path Problems
• How can we find the shortest route between two
points on a road map?
• Model the problem as a graph problem:
– Road map is a weighted graph:
vertices = cities
edges = road segments between cities
edge weights = road distances
– Goal: find a shortest path between two vertices (cities)
…
• Input:
– Directed graph G = (V, E)
– Weight function w : E → R
• Weight of path p = ⟨v0, v1, . . . , vk⟩:
w(p) = Σi=1..k w(vi-1, vi)
• Shortest-path weight from u to v:
δ(u, v) = min { w(p) : p is a path from u to v } if there exists a path from u to v
∞ otherwise
• Note: there might be multiple shortest paths from u to v

[Figure: directed example graph with source s and vertices t, x, y, z; shortest-path weights from s are 0, 3, 9, 5, 11]
Variants of Shortest Path
• Single-source shortest paths
– G = (V, E) ⇒ find a shortest path from a given source
vertex s to each vertex v ∈ V
• Single-destination shortest paths
– Find a shortest path to a given destination vertex t
from each vertex v
– Reversing the direction of each edge reduces this to the
single-source problem
…
• Single-pair shortest path
– Find a shortest path from u to v for given vertices u
and v
• All-pairs shortest-paths
– Find a shortest path from u to v for every pair of
vertices u and v
Scheduling
Scheduling criteria
• CPU utilization - keep the CPU as busy as possible (from 0%
to 100%)
• Throughput - number of processes that complete their
execution per time unit
• Turnaround time - amount of time to execute a particular
process
• Waiting time - amount of time a process has been waiting in
the ready queue
• Response time - amount of time it takes from when a request
was submitted until the first response is produced
Optimization criteria
» Max CPU utilization
» Max throughput
» Min turnaround time
» Min waiting time
» Min response time
Scheduling Algorithms
• First Come First Serve Scheduling
• Shortest Job First Scheduling
• Priority Scheduling
• Round-Robin Scheduling

First Come First Serve Scheduling (FCFS)
Process Burst time
P1 24
P2 3
P3 3
• Suppose that the processes arrive in the order: P1, P2, P3
• The Gantt chart for the schedule is:
…
• The average waiting time in this policy is usually quite long
• Waiting time for P1 = 0, P2 = 24, P3 = 27
• Average waiting time = (0+24+27)/3 = 17
• Suppose we change the order of arrival to P2, P3, P1
• The Gantt chart for the schedule is:
Waiting time for P1 = 6, P2 = 0, P3 = 3
Average waiting time: (6 + 0 + 3)/3 = 3
• Consider one CPU-bound process and many I/O-bound processes
• There is a convoy effect, as all the other processes wait for the one big
process to get off the CPU
• The FCFS scheduling algorithm is non-preemptive
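The two averages above can be reproduced with a small helper (hypothetical name, all processes assumed to arrive at time 0): each process waits for the total burst time of everything ahead of it.

```python
# FCFS waiting times: process i waits for the sum of all earlier bursts.
def fcfs_waiting(bursts):
    waits, elapsed = [], 0
    for b in bursts:
        waits.append(elapsed)
        elapsed += b
    return waits

w = fcfs_waiting([24, 3, 3])   # arrival order P1, P2, P3
print(w, sum(w) / len(w))      # [0, 24, 27] 17.0
w = fcfs_waiting([3, 3, 24])   # arrival order P2, P3, P1
print(sum(w) / len(w))         # 3.0
```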
Shortest Job First Scheduling (SJF)
• This algorithm associates with each process the length of the
process's next CPU burst
• If there is a tie, FCFS is used
• In other words, this algorithm can also be regarded as the
shortest-next-CPU-burst algorithm
• SJF is optimal - gives minimum average waiting time for a
given set of processes
Processes Burst time
P1 6
P2 8
P3 7
P4 3
FCFS average waiting time:
• waits for P1, P2, P3, P4: 0, 6, 14, 21
• (0+6+14+21)/4 = 10.25
SJF average waiting time:
• waits for P1, P2, P3, P4: 3, 16, 9, 0
• (3+16+9+0)/4 = 7
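A sketch of non-preemptive SJF for the table above (hypothetical helper name, all processes assumed to arrive at time 0): run the shortest burst first and accumulate waiting times.

```python
# Non-preemptive SJF: run bursts in increasing order of length.
def sjf_waiting(bursts):
    order = sorted(range(len(bursts)), key=lambda i: bursts[i])
    waits, elapsed = [0] * len(bursts), 0
    for i in order:
        waits[i] = elapsed     # process i starts after all shorter bursts
        elapsed += bursts[i]
    return waits

w = sjf_waiting([6, 8, 7, 3])  # P1..P4 from the slide
print(w, sum(w) / len(w))      # [3, 16, 9, 0] 7.0
```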
…
Two schemes:
– Non-preemptive - once the CPU is given to a process, it cannot
be preempted until it completes its CPU burst
– Preemptive - if a new process arrives with a CPU burst length less
than the remaining time of the currently executing process, preempt.
• This scheme is known as Shortest-Remaining-Time-First
(SRTF)
Priority Scheduling
• A priority number (integer) is associated with each process.
The CPU is allocated to the process with the highest priority
(smallest integer = highest priority)
– Preemptive
– Non-preemptive
• SJF is a special case of priority scheduling where the priority is
the predicted next CPU burst time

Processes  Burst time  Priority  Arrival time
P1         10          3         0.0
P2         1           1         1.0
P3         2           4         2.0
P4         1           5         3.0
P5         5           2         4.0

The average waiting time
= (6+0+16+18+1)/5 = 8.2
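The 8.2 figure can be reproduced with a sketch that, like the slide's computation, ignores the arrival times and simply runs processes in priority order (smallest number = highest priority); the helper name is illustrative.

```python
# Non-preemptive priority scheduling, arrival times ignored as in the slide.
def priority_waiting(bursts, priorities):
    order = sorted(range(len(bursts)), key=lambda i: priorities[i])
    waits, elapsed = [0] * len(bursts), 0
    for i in order:            # run in order of increasing priority number
        waits[i] = elapsed
        elapsed += bursts[i]
    return waits

w = priority_waiting([10, 1, 2, 1, 5], [3, 1, 4, 5, 2])  # P1..P5
print(w, sum(w) / len(w))  # [6, 0, 16, 18, 1] 8.2
```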
…
• Problem: Starvation - low-priority processes may never
execute
• Solution: Aging - as time progresses, increase the priority of
the process
Round-Robin Scheduling
• Round-Robin is designed especially for time-sharing
systems.
• It is similar to FCFS, but adds preemption
• A small unit of time, called the time quantum, is defined
• Each process gets a small unit of CPU time (one time quantum),
usually 10-100 milliseconds. After this time has elapsed, the
process is preempted and added to the end of the ready queue.
…
• If there are n processes in the ready queue and the time quantum
is q, then each process gets 1/n of the CPU time in chunks of at
most q time units at once. No process waits more than (n-1)q
time units.
• Performance:
– q large => behaves like FIFO
– q small => q must still be large relative to the context-switch
time, otherwise the overhead is too high
• Typically, higher average turnaround than SJF, but better
response
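A minimal round-robin simulation (hypothetical helper, all processes assumed to arrive at time 0, waiting time taken as completion time minus burst time), applied to the earlier FCFS burst times (24, 3, 3) with q = 4:

```python
from collections import deque

# Round-robin: each process runs at most q time units, then goes to the
# back of the ready queue if it still has work left.
def rr_waiting(bursts, q):
    remaining = list(bursts)
    ready = deque(range(len(bursts)))
    clock, completion = 0, [0] * len(bursts)
    while ready:
        i = ready.popleft()
        run = min(q, remaining[i])
        clock += run
        remaining[i] -= run
        if remaining[i] > 0:
            ready.append(i)        # preempted: back of the ready queue
        else:
            completion[i] = clock
    # waiting time = completion - burst, since all arrivals are at time 0
    return [c - b for c, b in zip(completion, bursts)]

w = rr_waiting([24, 3, 3], q=4)
print(w, sum(w) / len(w))  # [6, 4, 7], average about 5.67
```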
Reading Assignment
• Multilevel Queue Scheduling
• Multilevel Feedback-Queue Scheduling