AN INTRODUCTION
TO
DECISION TREES
Prepared for:
CIS595 Knowledge Discovery and Data Mining
Professor Vasileios Megalooikonomou
Presented by:
Thomas Mahoney
Learning Systems
 Learning systems consider
 Solved cases - cases assigned to a class
 Information from the solved cases - general
decision rules
 Rules - implemented in a model
 Model - applied to new cases
 Different types of models - present their results
in various forms
 Linear discriminant model - mathematical
equation (p = ax₁ + bx₂ + cx₃ + dx₄ + ex₅)
 Presentation comprehensibility
Data Classification and Prediction
 Data classification
 classification
 prediction
 Methods of classification
 decision tree induction
 Bayesian classification
 backpropagation
 association rule mining
Data Classification and Prediction
 Method creates model from a set of
training data
 individual data records (samples, objects,
tuples)
 each record is described by its attributes
 records are arranged in a set of classes
 supervised learning - each record is assigned
a class label
Data Classification and Prediction
 Model form representations
 mathematical formulae
 classification rules
 decision trees
 Model utility for data classification
 degree of accuracy
 predict unknown outcomes for a new (unseen)
data set
 classification - outcomes are always discrete or
nominal values
 regression - outcomes may be continuous or
ordered values
Description of
Decision Rules or Trees
 Intuitive appeal for users
 Presentation Forms
 “if, then” statements (decision rules)
 graphically - decision trees
What They Look Like
 Works like a flow chart
 Looks like an upside-down tree
 Nodes
 appear as rectangles or circles
 represent test or decision
 Lines or branches - represent outcome of
a test
 Circles - terminal (leaf) nodes
 Top or starting node - root node
 Internal nodes - rectangles
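As a minimal sketch of this structure, assuming Python and invented field names (not from the original slides), a node might look like:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    """One node of a decision tree (illustrative representation)."""
    attribute: Optional[str] = None   # attribute tested at an internal node
    left: Optional["Node"] = None     # branch taken when the test is false
    right: Optional["Node"] = None    # branch taken when the test is true
    label: Optional[str] = None      # class label, set only at a leaf

    def is_leaf(self) -> bool:
        return self.label is not None
```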
An Example
 Bank - loan application
 Classify application
 approved class
 denied class
 Criteria - target class is approved if 3 binary
attributes have certain values:
 (a) borrower has good credit history (credit rating in excess
of some threshold)
 (b) loan amount less than some percentage of collateral value
(e.g., 80% home value)
 (c) borrower has income to make payments on loan
 Possible scenarios = 2³ = 8
 If the parameters for splitting the nodes can be adjusted, the
number of scenarios grows exponentially.
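A rough rendering of the loan criteria as a chain of tests; the parameter names here are invented for illustration, not from the original slides:

```python
from itertools import product

def classify_application(good_credit: bool,
                         loan_below_80pct_collateral: bool,
                         income_covers_payments: bool) -> str:
    """Approve only when all three binary criteria hold."""
    if not good_credit:
        return "denied"
    if not loan_below_80pct_collateral:
        return "denied"
    if not income_covers_payments:
        return "denied"
    return "approved"

# Enumerate all 2^3 = 8 scenarios: exactly one reaches the approved class.
assert sum(classify_application(*s) == "approved"
           for s in product([False, True], repeat=3)) == 1
```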
How They Work
 Decision rules - partition sample of data
 Terminal node (leaf) indicates the class assignment
 Tree partitions samples into mutually exclusive groups
 One group for each terminal node
 All paths
 start at the root node
 end at a leaf
 Each path represents a decision rule
 joining (AND) of all the tests along that path
 separate paths that result in the same class are disjunctions (ORs)
 All paths - mutually exclusive
 for any one case - only one path will be followed
 false decisions on the left branch
 true decisions on the right branch
Disjunctive Normal Form
 Non-terminal node - model identifies an
attribute to be tested
 test splits the attribute's values into mutually
exclusive (disjoint) sets
 splitting continues until a node contains only one
class (terminal node or leaf)
 Structure - disjunctive normal form
 limits the form of a rule to conjunctions (AND-ing)
of terms
 allows disjunction (OR-ing) over a set of rules
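Continuing the illustrative loan example, the single approving path reads as one conjunction, and a hypothetical tree with several approving paths would OR such conjunctions together:

```python
# Each root-to-leaf path is a conjunction (AND) of its tests; paths that
# end in the same class are joined by disjunction (OR).
def approved(good_credit: bool, low_loan_to_value: bool,
             income_ok: bool) -> bool:
    # Single approving path in the loan tree:
    return good_credit and low_loan_to_value and income_ok

# A hypothetical tree with two approving paths would read, e.g.:
#   (good_credit and low_loan_to_value) or (good_credit and income_ok)
```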
Geometry
 Disjunctive normal form
 Fits the shapes of decision boundaries between classes
 Class boundaries formed by lines parallel to the axes
 Result - rectangular-shaped class regions
Binary Trees
 Characteristics
 two branches leave each non-terminal
node
 those two branches cover outcomes of
the test
 exactly one branch enters each non-
root node
 there are n terminal nodes
 there are n-1 non-terminal nodes
Nonbinary Trees
 Characteristics
 two or more branches leave each non-
terminal node
 those branches cover outcomes of the
test
 exactly one branch enters each non-
root node
 there are n terminal nodes
 there are at most n-1 non-terminal nodes
Goal
 Dual goal - Develop tree that
 is small
 classifies and predicts class with accuracy
 Small size
 a smaller tree is more easily understood
 a smaller tree is less susceptible to overfitting
 a large tree gives less general information for
classifying and predicting new cases
Rule Induction
 Process of building the decision tree or
ascertaining the decision rules
 tree induction
 rule induction
 induction
 Decision tree algorithms
 induce decision trees recursively
 from the root (top) down - greedy approach
 established basic algorithms include ID3 and
C4.5
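A minimal sketch of that greedy, top-down recursion in the style of ID3 (C4.5 refines it with gain ratio and other extensions); it assumes records are dicts with a "class" key and attributes is a set of discrete attribute names, and the helper names are mine, not published pseudocode:

```python
import math
from collections import Counter

def entropy(labels):
    """- sum_j p_j log2 p_j over the class proportions p_j."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(records, attr):
    """Entropy before the split minus the weighted entropy after it."""
    before = entropy([r["class"] for r in records])
    n, after = len(records), 0.0
    for v in {r[attr] for r in records}:
        subset = [r["class"] for r in records if r[attr] == v]
        after += len(subset) / n * entropy(subset)
    return before - after

def induce(records, attributes):
    """Grow the tree greedily from the root (top) down."""
    labels = [r["class"] for r in records]
    if len(set(labels)) == 1 or not attributes:  # pure node or no tests left
        return {"leaf": Counter(labels).most_common(1)[0][0]}
    # Greedy step: take the best split now, with no lookahead at later splits.
    best = max(attributes, key=lambda a: information_gain(records, a))
    return {"test": best,
            "branches": {v: induce([r for r in records if r[best] == v],
                                   attributes - {best})
                         for v in {r[best] for r in records}}}
```

For instance, induce(loans, {"credit", "collateral", "income"}) (a hypothetical call) would return a nested dict whose "leaf" entries carry the class assignments.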
Discrete vs. Continuous Attributes
 Continuous-valued attributes -
problems for decision trees
 increase computational complexity of the task
 promote prediction inaccuracy
 lead to overfitting of data
 Convert continuous variables into discrete
intervals
 “greater than or equal to” and “less than”
 an optimal conversion is difficult to find
 hard to determine the ideal size and number
of the discrete intervals
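One common conversion, sketched here under the assumption that candidate cut points are taken midway between adjacent distinct values; every candidate must then be scored, which is where the extra computational cost comes from:

```python
def candidate_thresholds(values):
    """Midpoints between adjacent distinct values of a continuous attribute."""
    v = sorted(set(values))
    return [(a + b) / 2 for a, b in zip(v, v[1:])]

# Each threshold t defines a binary test, "less than t" versus
# "greater than or equal to t"; induction scores every candidate
# and keeps the best.
print(candidate_thresholds([3.1, 2.0, 2.0, 4.5]))  # [2.55, 3.8]
```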
Making the Split
 Models induce a tree by recursively
selecting and subdividing attributes
 random selection picks up noisy variables
 inefficiently produces inaccurate trees
 Efficient models
 examine each variable
 determine which split most improves the accuracy
of the entire tree
 problem - this approach decides best split
without considering subsequent splits
Evaluating the Splits
Measures of impurity, or their inverse, goodness, score how well a split reduces the impurity, or degree of randomness, at each node. Popular measures include:
Entropy Function
- Σ_j p_j log p_j
Gini Index
1 - Σ_j p_j²
Twoing Rule
(T_L / n) · (T_R / n) · (Σ_{i=1..k} |L_i / T_L - R_i / T_R|)²
where T_L and T_R count the cases sent to the left and right branches, n = T_L + T_R, and L_i and R_i count the class-i cases in each branch (k classes).
Evaluating the Splits
Max Minority
Sum of Variances
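A rough Python rendering of these measures; the function names and the left/right-counts interface are my own choices, with max minority defined as it usually is in the oblique-tree literature:

```python
from collections import Counter

def gini(labels):
    """Gini index: 1 - sum_j p_j**2 over the class proportions."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def twoing(left, right):
    """Twoing rule: (T_L/n)(T_R/n)(sum_i |L_i/T_L - R_i/T_R|)**2.
    Assumes both branches are non-empty."""
    tl, tr = len(left), len(right)
    n = tl + tr
    lc, rc = Counter(left), Counter(right)
    spread = sum(abs(lc[c] / tl - rc[c] / tr) for c in set(left) | set(right))
    return (tl / n) * (tr / n) * spread ** 2

def max_minority(left, right):
    """Larger of the two children's minority (non-majority) counts."""
    minority = lambda ls: len(ls) - Counter(ls).most_common(1)[0][1]
    return max(minority(left), minority(right))
```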
Overfitting
 Error rate in predicting the correct
class for new cases
 overfitting of training data
 very low apparent error rate (on the training data)
 high actual error rate (on new cases)
Optimal Size
 Down to a certain minimal size, a smaller tree has
 a higher apparent error rate
 a lower actual error rate
 Goal
 identify threshold
 minimize actual error rate
 achieve greatest predictive accuracy
Ending Tree Growth
 Grow the tree until
 additional splitting produces no
significant information gain
 statistical test - a chi-squared test
 problem - may stop too soon, yielding trees that
are too small
 the test only compares one split with the next
descending split
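For illustration only, a chi-squared test of one proposed split using scipy.stats (the contingency counts here are invented): if the class distributions in the two children do not differ significantly, growth stops at this node.

```python
from scipy.stats import chi2_contingency

# Rows: left and right child; columns: counts of each class (made up).
table = [[30, 10],
         [12, 28]]
chi2, p, dof, _ = chi2_contingency(table)
# An insignificant difference means the split adds no significant
# information gain, so the tree stops growing here.
print("stop growing" if p >= 0.05 else "keep splitting")
```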
Pruning
 Grow large tree
 reduce its size by eliminating or pruning weak
branches step by step
 continue until the true error rate reaches its minimum
 Pruning Methods
 reduced-error pruning
 divides samples into test set and training set
 training set is used to produce the fully
expanded tree
 tree is then tested using the test set
 weak branches are pruned
 stop when no more improvement
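A sketch of reduced-error pruning over the dict trees built by the induce() sketch earlier; the bottom-up traversal and helper names are my framing of the steps above, not a published implementation:

```python
from collections import Counter

def classify(tree, record):
    # Assumes every attribute value in the test set was seen in training.
    while "leaf" not in tree:
        tree = tree["branches"][record[tree["test"]]]
    return tree["leaf"]

def errors(tree, records):
    return sum(classify(tree, r) != r["class"] for r in records)

def prune(tree, train, test):
    """Reduced-error pruning: collapse a subtree to a leaf whenever that
    does not hurt accuracy on the held-out test set."""
    if "leaf" in tree:
        return tree
    # Prune the children first (bottom-up), routing each record down.
    for value, branch in tree["branches"].items():
        sub_train = [r for r in train if r[tree["test"]] == value]
        sub_test = [r for r in test if r[tree["test"]] == value]
        tree["branches"][value] = prune(branch, sub_train, sub_test)
    # Candidate leaf labeled with the majority training class at this node.
    leaf = {"leaf": Counter(r["class"] for r in train).most_common(1)[0][0]}
    if errors(leaf, test) <= errors(tree, test):
        return leaf  # the weak branch is pruned
    return tree
```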
Pruning
 Resampling
 5-fold cross-validation
 80% cases used for training; remainder for
testing
 Weakest-link or cost-complexity pruning
 trim the weakest link (the branch whose removal
produces the smallest increase in the apparent error rate)
 method can be combined with resampling
Variations and Enhancements
to Basic Decision Trees
 Multivariate or Oblique Trees
 CART-LC - CART with Linear
Combinations
 LMDT - Linear Machine Decision Trees
 SADT - Simulated Annealing of Decision
Trees
 OC1 - Oblique Classifier 1
Evaluating Decision Trees
 Method’s Appropriateness
 Data set or type
 Criteria
 accuracy - predict class label for new data
 scalability
• performs model generation and prediction functions
• large data sets
• satisfactory speed
 robustness
• perform well despite noisy or missing data
 intuitive appeal
• results easily understood
• promotes decision making
Decision Tree Limitations
 No backtracking
 finds a locally optimal solution, not the globally
optimal solution
 lookahead features may give us better trees
 Rectangular-shaped geometric regions
 in two-dimensional space
• regions bounded by lines parallel to the x- and y-axes
 some linear relationships, not parallel to the
axes, are fit poorly by such regions
Conclusions
 Utility
 analyze classified data
 produce
 accurate and easily understood classification
rules
 with good predictive value
Improvements
 Limitations being addressed
 multivariate discrimination - oblique trees
 data mining techniques
Bibliography
 Murthy, S. K., Kasif, S., and Salzberg, S., "A System for Induction of
Oblique Decision Trees," Journal of Artificial Intelligence Research 2
(1994), 1-32.
 Murthy, S. K., "Automatic Construction of Decision Trees from Data: A
Multi-Disciplinary Survey," Data Mining and Knowledge Discovery 2 (1998),
345-389, Kluwer Academic Publishers.
 Breiman, L., Friedman, J., Olshen, R., and Stone, C., Classification and
Regression Trees, Wadsworth Int. Group, 1984.
 Weiss, S. M., and Kulikowski, C. A., Computer Systems That Learn,
Morgan Kaufmann, 1991.
 Han, J., and Kamber, M., Data Mining: Concepts and Techniques,
Morgan Kaufmann, 2001.
 Andrews, H. C., Introduction to Mathematical Techniques in Pattern
Recognition, Wiley-Interscience, 1972.
 Mitchell, T. M., Machine Learning, McGraw-Hill, 1997.