Decision Trees
Jeff Storey
Overview
 What is a Decision Tree
 Sample Decision Trees
 How to Construct a Decision Tree
 Problems with Decision Trees
 Decision Trees in Gaming
 Summary
What is a Decision Tree?
 An inductive learning task
 Use particular facts to make more generalized conclusions
 A predictive model based on a branching series of Boolean tests
 These smaller Boolean tests are less complex than a one-stage classifier
 Let’s look at a sample decision tree…
Predicting Commute Time
Leave At:
  8 AM -> Long
  9 AM -> Accident? (No -> Medium, Yes -> Long)
  10 AM -> Stall? (No -> Short, Yes -> Long)

If we leave at 10 AM and there are no cars stalled on the road, what will our commute time be?
Inductive Learning
 In this decision tree, we made a series of
Boolean decisions and followed the
corresponding branch
 Did we leave at 10 AM?
 Did a car stall on the road?
 Is there an accident on the road?
 By answering each of these yes/no
questions, we then came to a conclusion on
how long our commute might take
Decision Trees as Rules
 We did not have to represent this tree graphically
 We could have represented it as a set of rules; however, that may be much harder to read…
Decision Tree as a Rule Set
if hour == 8am
    commute time = long
else if hour == 9am
    if accident == yes
        commute time = long
    else
        commute time = medium
else if hour == 10am
    if stall == yes
        commute time = long
    else
        commute time = short
 Notice that not all attributes have to be used in each path of the decision.
 As we will see, all attributes may not even appear in the tree.
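
For readers who prefer code, the same rule set could be written as a small function. This is only an illustrative Python sketch; the function and value names are not from the original slides.

def predict_commute(hour, accident, stall):
    # Rule set read off the tree above; hour is "8 AM", "9 AM" or "10 AM".
    if hour == "8 AM":
        return "Long"
    elif hour == "9 AM":
        return "Long" if accident == "Yes" else "Medium"
    elif hour == "10 AM":
        return "Long" if stall == "Yes" else "Short"
    return None  # a leave time not covered by the tree

For example, predict_commute("10 AM", accident="No", stall="No") returns "Short", which answers the question posed under the sample tree.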
How to Create a Decision Tree
 We first make a list of attributes that we
can measure
 These attributes (for now) must be discrete
 We then choose a target attribute that
we want to predict
 Then create an experience table that
lists what we have seen in the past
Sample Experience Table
Example   Hour    Weather   Accident   Stall   Commute (Target)
D1        8 AM    Sunny     No         No      Long
D2        8 AM    Cloudy    No         Yes     Long
D3        10 AM   Sunny     No         No      Short
D4        9 AM    Rainy     Yes        No      Long
D5        9 AM    Sunny     Yes        Yes     Long
D6        10 AM   Sunny     No         No      Short
D7        10 AM   Cloudy    No         No      Short
D8        9 AM    Rainy     No         No      Medium
D9        9 AM    Sunny     Yes        No      Long
D10       10 AM   Cloudy    Yes        Yes     Long
D11       10 AM   Rainy     No         No      Short
D12       8 AM    Cloudy    Yes        No      Long
D13       9 AM    Sunny     No         No      Medium
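
For the code sketches later in this deck, assume the experience table is encoded as a plain Python list of dictionaries, one record per example. The field names are just one convenient, hypothetical encoding of the table above.

# The thirteen examples D1..D13 from the experience table, one dict per example.
EXAMPLES = [
    {"hour": "8 AM",  "weather": "Sunny",  "accident": "No",  "stall": "No",  "commute": "Long"},    # D1
    {"hour": "8 AM",  "weather": "Cloudy", "accident": "No",  "stall": "Yes", "commute": "Long"},    # D2
    {"hour": "10 AM", "weather": "Sunny",  "accident": "No",  "stall": "No",  "commute": "Short"},   # D3
    {"hour": "9 AM",  "weather": "Rainy",  "accident": "Yes", "stall": "No",  "commute": "Long"},    # D4
    {"hour": "9 AM",  "weather": "Sunny",  "accident": "Yes", "stall": "Yes", "commute": "Long"},    # D5
    {"hour": "10 AM", "weather": "Sunny",  "accident": "No",  "stall": "No",  "commute": "Short"},   # D6
    {"hour": "10 AM", "weather": "Cloudy", "accident": "No",  "stall": "No",  "commute": "Short"},   # D7
    {"hour": "9 AM",  "weather": "Rainy",  "accident": "No",  "stall": "No",  "commute": "Medium"},  # D8
    {"hour": "9 AM",  "weather": "Sunny",  "accident": "Yes", "stall": "No",  "commute": "Long"},    # D9
    {"hour": "10 AM", "weather": "Cloudy", "accident": "Yes", "stall": "Yes", "commute": "Long"},    # D10
    {"hour": "10 AM", "weather": "Rainy",  "accident": "No",  "stall": "No",  "commute": "Short"},   # D11
    {"hour": "8 AM",  "weather": "Cloudy", "accident": "Yes", "stall": "No",  "commute": "Long"},    # D12
    {"hour": "9 AM",  "weather": "Sunny",  "accident": "No",  "stall": "No",  "commute": "Medium"},  # D13
]
ATTRIBUTES = ["hour", "weather", "accident", "stall"]
TARGET = "commute"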
Choosing Attributes
 The previous experience table showed 4 attributes: hour, weather, accident and stall
 But the decision tree only showed 3
attributes: hour, accident and stall
 Why is that?
Choosing Attributes
 Methods for selecting attributes (which
will be described later) show that
weather is not a discriminating
attribute
 We use the principle of Occam’s
Razor: Given a number of competing
hypotheses, the simplest one is
preferable
Choosing Attributes
 The basic procedure for creating a decision tree is the same for most decision tree algorithms
 The difference lies in how we select
the attributes for the tree
 We will focus on the ID3 algorithm
developed by Ross Quinlan in 1975
Decision Tree Algorithms
 The basic idea behind any decision tree
algorithm is as follows:
 Choose the best attribute(s) to split the
remaining instances and make that attribute a
decision node
 Repeat this process recursively for each child
 Stop when:
 All the instances have the same target attribute value
 There are no more attributes
 There are no more instances
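
A minimal Python sketch of that recursive procedure, assuming the EXAMPLES encoding shown earlier. The attribute-selection step is deliberately left as a placeholder; ID3's entropy-based choice is what the next slides describe.

from collections import Counter

def choose_attribute(examples, attributes, target):
    # Placeholder selection rule: take the first remaining attribute.
    # ID3 would instead pick the attribute with the lowest expected entropy
    # (equivalently, the highest information gain); see the following slides.
    return attributes[0]

def build_tree(examples, attributes, target):
    labels = [ex[target] for ex in examples]
    if len(set(labels)) == 1:       # all instances have the same target value
        return labels[0]
    if not attributes:              # no attributes left: return the majority value
        return Counter(labels).most_common(1)[0][0]
    best = choose_attribute(examples, attributes, target)
    node = {best: {}}
    # Branch only on values that actually occur, so every subset is non-empty.
    for value in set(ex[best] for ex in examples):
        subset = [ex for ex in examples if ex[best] == value]
        remaining = [a for a in attributes if a != best]
        node[best][value] = build_tree(subset, remaining, target)
    return node

For example, build_tree(EXAMPLES, ATTRIBUTES, TARGET) returns a nested dictionary representing one possible tree.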
Identifying the Best Attributes
 Refer back to our original decision tree
Leave At:
  8 AM -> Long
  9 AM -> Accident? (No -> Medium, Yes -> Long)
  10 AM -> Stall? (No -> Short, Yes -> Long)

 How did we know to split on Leave At first, and then on Stall and Accident, but not on Weather?
ID3 Heuristic
 To determine the best attribute, we
look at the ID3 heuristic
 ID3 splits attributes based on their
entropy.
 Entropy is a measure of the disorder (uncertainty) in the data…
Entropy
 Entropy is minimized when all values of the
target attribute are the same.
 If we know that commute time will always be
short, then entropy = 0
 Entropy is maximized when there is an
equal chance of all values for the target
attribute (i.e. the result is random)
 If commute time = short in 3 instances, medium
in 3 instances and long in 3 instances, entropy is
maximized
Entropy
 Calculation of entropy:
 Entropy(S) = ∑(i = 1 to l) -(|Si|/|S|) * log2(|Si|/|S|)
 S = the set of examples
 Si = the subset of S whose target attribute has value vi
 l = the number of distinct values of the target attribute
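
The formula translates directly into a few lines of Python; this sketch assumes the list-of-dicts encoding used for the experience table earlier.

import math
from collections import Counter

def entropy(examples, target):
    # Entropy(S) = sum over target values of -(|Si|/|S|) * log2(|Si|/|S|)
    total = len(examples)
    counts = Counter(ex[target] for ex in examples)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())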
ID3
 ID3 splits on attributes with the lowest
entropy
 We calculate the entropy for all values of an
attribute as the weighted sum of subset
entropies as follows:
 ∑(i = 1 to k) (|Si|/|S|) * Entropy(Si), where k is the number of distinct values of the attribute we are testing
 We can also measure information gain (which is inversely related to the expected entropy) as follows:
 Entropy(S) - ∑(i = 1 to k) (|Si|/|S|) * Entropy(Si)
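
A sketch of both quantities, reusing the entropy helper above. Choosing the attribute with the lowest expected entropy is the same as choosing the one with the highest information gain.

def expected_entropy(examples, attribute, target):
    # Weighted sum of subset entropies: sum over values v of (|Sv|/|S|) * Entropy(Sv)
    total = len(examples)
    result = 0.0
    for value in set(ex[attribute] for ex in examples):
        subset = [ex for ex in examples if ex[attribute] == value]
        result += (len(subset) / total) * entropy(subset, target)
    return result

def information_gain(examples, attribute, target):
    # Entropy(S) minus the expected entropy after splitting on the attribute
    return entropy(examples, target) - expected_entropy(examples, attribute, target)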
ID3
 Given our commute time sample set, we
can calculate the entropy of each attribute
at the root node
Attribute   Expected Entropy   Information Gain
Hour        0.6511             0.768449
Weather     1.28884            0.130719
Accident    0.92307            0.496479
Stall       1.17071            0.248842
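
As a sanity check on the Hour row, using the formulas above: the 13 examples contain 7 Long, 4 Short and 2 Medium commutes, so Entropy(S) = -(7/13)log2(7/13) - (4/13)log2(4/13) - (2/13)log2(2/13) ≈ 1.42. Splitting on Hour gives three subsets: 8 AM (3 examples, all Long, entropy 0), 9 AM (5 examples, 3 Long and 2 Medium, entropy ≈ 0.971) and 10 AM (5 examples, 4 Short and 1 Long, entropy ≈ 0.722). The expected entropy is therefore (3/13)(0) + (5/13)(0.971) + (5/13)(0.722) ≈ 0.6511, and the information gain is 1.42 - 0.6511 ≈ 0.768, matching the Hour entry in the table.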
Pruning Trees
 There is another technique for
reducing the number of attributes used
in a tree - pruning
 Two types of pruning:
 Pre-pruning (forward pruning)
 Post-pruning (backward pruning)
Prepruning
 In prepruning, we decide during the building
process when to stop adding attributes
(possibly based on their information gain)
 However, this may be problematic – Why?
 Sometimes attributes individually do not
contribute much to a decision, but combined,
they may have a significant impact
Postpruning
 Postpruning waits until the full decision tree has been built and then prunes attributes from it
 Two techniques:
 Subtree Replacement
 Subtree Raising
Subtree Replacement
 Entire subtree is replaced by a single
leaf node
[Diagram: root A with child B; B has children C, 4 and 5; C has leaves 1, 2 and 3]
Subtree Replacement
 Leaf node 6 has replaced the subtree rooted at C
 This generalizes the tree a little more, but may increase accuracy
[Diagram: root A with child B; B now has children 6, 4 and 5]
Subtree Raising
 Entire subtree is raised onto another
node
[Diagram: root A with child B; B has children C, 4 and 5; C has leaves 1, 2 and 3]
Subtree Raising
 Entire subtree is raised onto another node
 This was not discussed in detail as it is not
clear whether this is really worthwhile (as it
is very time consuming)
[Diagram: root A with child C; C has leaves 1, 2 and 3 (the subtree rooted at C has been raised to replace B, and leaves 4 and 5 are dropped)]
Problems with ID3
 ID3 is not optimal
 Uses expected entropy reduction, not
actual reduction
 Must use discrete (or discretized)
attributes
 What if we left for work at 9:30 AM?
 We could break down the attributes into
smaller values…
Problems with Decision Trees
 While decision trees classify quickly, the time needed to build a tree may be higher than for other types of classifiers
 Decision trees suffer from a problem of
errors propagating throughout a tree
 A very serious problem as the number of
classes increases
Error Propagation
 Since decision trees work by a series
of local decisions, what happens when
one of these local decisions is wrong?
 Every decision from that point on may be
wrong
 We may never return to the correct path
of the tree
Error Propagation Example
[Diagram only; not reproduced here]
Problems with ID3
 If we broke down leave time to the
minute, we might get something like
this:
[Diagram: a separate branch for every departure minute (8:02 AM, 8:03 AM, 9:05 AM, 9:07 AM, 9:09 AM, 10:02 AM), each leading to its own single leaf labelled Long, Medium or Short]
Since entropy is zero within each branch (every leaf holds a single example), we end up with n branches and n leaves. This would not be helpful for predictive modeling.
Problems with ID3
 We can use a technique known as
discretization
 We choose cut points, such as 9 AM, for splitting continuous attributes
 These cut points generally lie in a subset of boundary points, where a boundary point is a point at which two adjacent instances in a sorted list have different target attribute values
Problems with ID3
 Consider the leave-time attribute, sorted and labelled with its commute time (L = Long, M = Medium, S = Short):
8:00 (L), 8:02 (L), 8:07 (M), 9:00 (S), 9:20 (S), 9:25 (S), 10:00 (S), 10:02 (M)
When we split only at these boundary points, we accept slightly higher entropy within each branch, but we avoid a decision tree with as many cut points as leaves
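
A rough sketch of how such boundary points could be found, with leave times converted to minutes after midnight and L/M/S standing for the commute-time labels; the encoding is illustrative, not from the slides.

def boundary_points(sorted_examples):
    # Candidate cut points: midpoints between adjacent examples whose labels differ.
    cuts = []
    for (v1, label1), (v2, label2) in zip(sorted_examples, sorted_examples[1:]):
        if label1 != label2:
            cuts.append((v1 + v2) / 2.0)
    return cuts

# 8:00 (L), 8:02 (L), 8:07 (M), 9:00 (S), 9:20 (S), 9:25 (S), 10:00 (S), 10:02 (M)
times = [(480, "L"), (482, "L"), (487, "M"), (540, "S"),
         (560, "S"), (565, "S"), (600, "S"), (602, "M")]
print(boundary_points(times))  # cut points appear only where the label changes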
ID3 in Gaming
 Black & White, developed by Lionhead Studios and released in 2001, used ID3
 Used to predict a player’s reaction to a
certain creature’s action
 In this model, a greater feedback value
means the creature should attack
ID3 in Black & White
Example   Allegiance   Defense   Tribe    Feedback (Target)
D1        Friendly     Weak      Celtic   -1.0
D2        Enemy        Weak      Celtic    0.4
D3        Friendly     Strong    Norse    -1.0
D4        Enemy        Strong    Norse    -0.2
D5        Friendly     Weak      Greek    -1.0
D6        Enemy        Medium    Greek     0.2
D7        Enemy        Strong    Greek    -0.4
D8        Enemy        Medium    Aztec     0.0
D9        Friendly     Weak      Aztec    -1.0
ID3 in Black & White
Allegiance:
  Friendly -> -1.0
  Enemy -> Defense:
    Weak -> 0.4
    Medium -> 0.1
    Strong -> -0.3

Note that this decision tree does not even use the Tribe attribute
ID3 in Black & White
 Now suppose we don’t want the entire
decision tree, but we just want the 2
highest feedback values
 We can create a Boolean expression, such as:
((Allegiance = Enemy) ^ (Defense = Weak)) v ((Allegiance = Enemy) ^ (Defense = Medium))
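
In code, that expression is just a short predicate; the names are illustrative and not taken from the game's actual source.

def should_attack(allegiance, defense):
    # True only for the two highest-feedback leaves of the tree above:
    # (Allegiance = Enemy AND Defense = Weak) or (Allegiance = Enemy AND Defense = Medium)
    return allegiance == "Enemy" and defense in ("Weak", "Medium")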
Summary
 Decision trees can be used to help
predict the future
 The trees are easy to understand
 Decision trees work more efficiently
with discrete attributes
 The trees may suffer from error
propagation