Mohd Noor Abdul Hamid, Ph.D
Universiti Utara Malaysia
mohdnoor@uum.edu.my
After this class, you should be able to:
- Explain the C4.5 algorithm
- Use the algorithm to develop a Decision Tree
Decision trees are constructed using only those attributes best able to differentiate the concepts to be learned.
The main goal is to minimize the number of tree levels and tree nodes, thereby maximizing data generalization.
The C4.5 Algorithm
1. Let T be the set of training instances.
2. Choose the attribute that best differentiates the instances contained in T.
3. Create a tree node whose value is the chosen attribute. Create child links from this node, where each link represents a unique value for the chosen attribute.
4. Use the child link values to further subdivide the instances into subclasses.
5. For each subclass, test: do the instances in the subclass satisfy the predefined criteria, OR is the set of remaining attribute choices for this path null?
   Y: specify the classification for new instances following this decision path. END.
   N: let T be the current set of subclass instances and return to step 2.
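The loop above can be expressed compactly in code. Here is a minimal Python sketch, not Quinlan's full C4.5 (no pruning, no numeric or missing-value handling); the dict-based instance representation, the criterion_met callback, and attribute_score (a stand-in for the selection measure developed in the exercise below) are all illustrative assumptions.

```python
from collections import Counter

def majority_class(instances, target):
    """Most frequent class among the instances on this path."""
    return Counter(inst[target] for inst in instances).most_common(1)[0][0]

def build_tree(instances, attributes, target, criterion_met, attribute_score):
    # Step 5: the subclass satisfies the predefined criteria, or no
    # attributes remain -> specify the classification for this path.
    if criterion_met(instances) or not attributes:
        return majority_class(instances, target)
    # Step 2: choose the attribute that best differentiates the instances in T.
    best = max(attributes, key=lambda a: attribute_score(instances, a, target))
    # Step 3: one node for the chosen attribute, one child link per unique value.
    node = {"attribute": best, "children": {}}
    remaining = [a for a in attributes if a != best]
    for value in {inst[best] for inst in instances}:
        # Step 4: subdivide the instances into subclasses and recurse.
        subset = [inst for inst in instances if inst[best] == value]
        node["children"][value] = build_tree(subset, remaining, target,
                                             criterion_met, attribute_score)
    return node
```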
Exercise: The Scenario
• BIGG Credit Card company wishes to develop a predictive model in order to identify customers who are likely to take advantage of the life insurance promotion, so that it can mail the promotional item to those potential customers.
Exercise: The Scenario
The model will be developed using the data stored in the credit card promotion database. The data contains information obtained about customers through their initial credit card application, as well as data about whether these individuals have accepted various promotional offerings sponsored by the company. This database is our dataset.
Exercise
Step 1: Let T be the set of training instances
We follow our previous work with creditcardpromotion.xls. The dataset consists of 15 instances (observations), T, each having 10 attributes (variables). For our example, the input attributes are limited to five. Why? Because decision trees are constructed using only those attributes best able to differentiate the concepts to be learned.
The attributes used:

Attribute                  Type      Values                           Role
Age                        Interval  19 – 55 years                    Independent (input)
Sex                        Nominal   Male, Female                     Independent (input)
Income Range               Ordinal   20–30K, 30–40K, 40–50K, 50–60K   Independent (input)
Credit Card Insurance      Binary    Yes, No                          Independent (input)
Life Insurance Promotion   Binary    Yes, No                          Dependent (target / output)
Exercise
Step 2: Choose an attribute that best differentiates the instances contained in T
C4.5 uses a measure taken from information theory to help with the attribute selection process. The idea is that, at any choice point in the tree, C4.5 selects the attribute that splits the data so as to show the largest gain in information.
We need to choose the input attribute that best differentiates the instances in T. Our choices are among:
- Income Range
- Credit Card Insurance
- Sex
- Age
A Goodness Score is calculated for each attribute to determine which attribute best differentiates the training instances (T). We can develop a partial tree for each attribute in order to calculate its Goodness Score:

Goodness Score = (sum of the most frequently encountered class count in each branch ÷ |T|) ÷ number of branches

The numerator, the sum of the majority-class counts divided by |T|, is simply the accuracy of the partial tree.
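As a working aid (not from the slides), the same formula as a small Python helper; representing each branch as a collections.Counter of class-label counts is our own choice. It is reused in the checks that follow.

```python
from collections import Counter

def goodness_score(branches, n_total):
    """(Sum of the majority-class count in each branch / |T|) / number of branches.

    `branches` is a list of Counters mapping class label -> count.
    The first factor is the accuracy of the partial tree.
    """
    accuracy = sum(branch.most_common(1)[0][1] for branch in branches) / n_total
    return accuracy / len(branches)
```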
Step 2a. Income Range

Partial tree:

Income Range
  20–30K : 2 Yes, 2 No
  30–40K : 4 Yes, 1 No
  40–50K : 1 Yes, 3 No
  50–60K : 2 Yes, 0 No

Goodness Score = [(2 + 4 + 3 + 2) ÷ 15] ÷ 4 = 0.183
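Sanity-checking the 0.183 figure with the helper above (branch counts copied from the partial tree; the variable name is our own):

```python
from collections import Counter

income_branches = [
    Counter({"Yes": 2, "No": 2}),  # 20-30K
    Counter({"Yes": 4, "No": 1}),  # 30-40K
    Counter({"Yes": 1, "No": 3}),  # 40-50K
    Counter({"Yes": 2}),           # 50-60K
]
print(round(goodness_score(income_branches, 15), 3))  # 0.183
```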
Step 2b. Credit Card Insurance

Partial tree:

CC Insurance
  No  : 6 Yes, 6 No
  Yes : 3 Yes, 0 No

Goodness Score = [(6 + 3) ÷ 15] ÷ 2 = 0.30
Step 2c. Sex

Partial tree:

Sex
  Male   : 3 Yes, 5 No
  Female : 6 Yes, 1 No

Goodness Score = [(6 + 5) ÷ 15] ÷ 2 = 0.367
Step 2d. Age

Age is an interval (numeric) variable, so we need to determine the best split point among its values. For this example, we opt for a binary split. Why? Because the main goal is to minimize the number of tree levels and tree nodes, thereby maximizing data generalization.
Steps to determine the best split point for an interval (numerical) attribute:

1. Sort the Age values, paired with the target (Life Insurance Promotion):

Age  19 27 29 35 38 39 40 41 42 43 43 43 45 55 55
LIP   Y  N  Y  Y  Y  Y  Y  Y  N  Y  Y  N  N  N  N
2. A Goodness Score is computed for each possible split point. For example, splitting after Age = 19:

  ≤ 19 : 1 Yes, 0 No
  > 19 : 8 Yes, 6 No

Goodness Score = [(1 + 8) ÷ 15] ÷ 2 = 0.30
Splitting after Age = 27:

  ≤ 27 : 1 Yes, 1 No
  > 27 : 8 Yes, 5 No

Goodness Score = [(1 + 8) ÷ 15] ÷ 2 = 0.30
This process continues until a score for the split between 45 and 55 is obtained. The split point with the highest Goodness Score is then chosen: 43.
Splitting after Age = 43:

  ≤ 43 : 9 Yes, 3 No
  > 43 : 0 Yes, 3 No

Goodness Score = [(9 + 3) ÷ 15] ÷ 2 = 0.40
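This scan can be automated. The sketch below reuses goodness_score from earlier and copies the sorted Age/LIP pairs from the table above. One caveat from running the numbers: on this data the split after Age = 41 also reaches 0.40, tying 43; the slides choose 43.

```python
from collections import Counter

ages = [19, 27, 29, 35, 38, 39, 40, 41, 42, 43, 43, 43, 45, 55, 55]
lip  = ["Y", "N", "Y", "Y", "Y", "Y", "Y", "Y", "N", "Y", "Y", "N", "N", "N", "N"]

# Score every candidate threshold (each distinct value except the largest).
scores = {}
for split in sorted(set(ages))[:-1]:
    left  = Counter(l for a, l in zip(ages, lip) if a <= split)
    right = Counter(l for a, l in zip(ages, lip) if a > split)
    scores[split] = goodness_score([left, right], len(ages))

print(scores[43])  # 0.4, the maximum (41 ties it; the slides pick 43)
```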
The resulting partial tree for Age:

Age
  ≤ 43 : 9 Yes, 3 No
  > 43 : 0 Yes, 3 No
Overall Goodness Score for each input attribute:

Attribute               Goodness Score
Age                     0.400
Sex                     0.367
Credit Card Insurance   0.300
Income Range            0.183

Therefore the attribute Age is chosen as the top-level node.
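The whole comparison can be reproduced with the same helper (branch counts copied from the partial trees above; income_branches comes from the earlier snippet):

```python
from collections import Counter

attribute_scores = {
    "Age":                   goodness_score([Counter({"Yes": 9, "No": 3}),
                                             Counter({"No": 3})], 15),
    "Sex":                   goodness_score([Counter({"Yes": 3, "No": 5}),
                                             Counter({"Yes": 6, "No": 1})], 15),
    "Credit Card Insurance": goodness_score([Counter({"Yes": 6, "No": 6}),
                                             Counter({"Yes": 3})], 15),
    "Income Range":          goodness_score(income_branches, 15),
}
print(max(attribute_scores, key=attribute_scores.get))  # Age
```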
Exercise
Step 3
• Create a tree node whose value is the chosen attribute.
• Create child links from this node, where each link represents a unique value for the chosen attribute.

Age
  ≤ 43 : 9 Yes, 3 No
  > 43 : 0 Yes, 3 No
For each subclass:
a. If the instances in the subclass satisfy the predefined criteria, or if the set of remaining attribute choices for this path is null, specify the classification for new instances following this decision path.
b. If the subclass does not satisfy the predefined criteria and there is at least one attribute with which to further subdivide the path of the tree, let T be the current set of subclass instances and return to step 2.
Applying this to the two Age subclasses:

Age
  ≤ 43 : 9 Yes, 3 No → does not satisfy the predefined criteria. Subdivide!
  > 43 : 0 Yes, 3 No → satisfies the predefined criteria. Classification: Life Insurance = No
Subdividing the ≤ 43 branch on Sex:

Age
  ≤ 43 → Sex
      Female : 6 Yes, 0 No → Life Insurance = Yes
      Male   : 3 Yes, 3 No → Subdivide
  > 43 : 0 Yes, 3 No → Life Insurance = No
Subdividing the Male branch on CC Insurance completes the tree.

The Decision Tree: Life Insurance Promo

Age
  ≤ 43 → Sex
      Female : 6 Yes, 0 No → Life Insurance = Yes
      Male → CC Insurance
          No  : 1 Yes, 3 No → Life Insurance = No
          Yes : 2 Yes, 0 No → Life Insurance = Yes
  > 43 : 0 Yes, 3 No → Life Insurance = No
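For the assignment at the end, one direct implementation of this tree is a chain of conditionals. A minimal sketch, assuming the attribute values are spelled as in the Step 1 table:

```python
def predict_life_insurance_promo(age, sex, cc_insurance):
    """Classify a customer with the decision tree learned above.

    sex is "Male"/"Female"; cc_insurance is "Yes"/"No".
    """
    if age > 43:
        return "No"
    if sex == "Female":
        return "Yes"
    # Male, age <= 43: decide on Credit Card Insurance.
    return "Yes" if cc_insurance == "Yes" else "No"

print(predict_life_insurance_promo(35, "Male", "No"))  # No
```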
Exercise
The Decision Tree:
1. Our decision tree is able to accurately classify 14 out of the 15 training instances.
2. Therefore, the accuracy of our model is 93%.
Assignment
• Based on the Decision Tree model for the Life Insurance Promotion, develop an application (program) using any tool you are familiar with.
• Submit your code and report next week!