Constructing Comprehensible Neural Networks using Genetic Algorithms. By Amr Kamel Ahmed El-Sayed, Helwan University, Faculty of Engineering, Department of Communications, Electronics, and Computers. (Arabic title: Constructing Easily Interpretable Neural Networks Using Genetic Algorithms.) Master thesis under the supervision of Prof. Sayed Mostafa Saad and Dr. Alaa Hamdy.
PRIA-8-2007: 8th International Conference on Pattern Recognition and Image Analysis: New Information Technologies, International Association for Pattern Recognition (IAPR). Paper title: "Evolving Comprehensible Neural Network Trees Using Genetic Algorithms", conference proceedings, Volume 1, pages 186-190. Accepted for publication in the international journal Pattern Recognition and Image Analysis: Advances in Mathematical Theory and Applications, Vol. 18, No. 4 (English-language papers are published by Springer).
Constructing Comprehensible Neural Networks using Genetic Algorithms. This presentation introduces a new general neural network structure, the comprehensible neural network tree (CNNTREE), together with an evolutionary algorithm for designing it. It also reports results of experiments testing generalization, the performance of the evolutionary algorithm, and the performance of back-propagation learning.
Incomprehensible Neural Networks
ANNs solve various classes of problems. ANNs are well suited to many classes of problems in the knowledge discovery field. They have been applied successfully to function approximation, regression analysis, time-series prediction, classification, data clustering, and data filtering. The parallel structure of neural networks makes them suitable for online learning under real-time constraints. Their units are simple and cheap, which makes it possible to build relatively inexpensive parallel processors consisting of massive numbers of units.
However, ANNs are too complex to understand. ANNs, which consist of a large number of weighted connections and activation units, often produce incomprehensible, hard-to-understand models. When an input is fed to a trained neural network, the user cannot infer how a particular output was obtained; there is no way to give a meaningful interpretation of the network parameters or of its response to a specific input. As a result, the knowledge acquired by neural networks is considered incomprehensible and cannot be transferred to other knowledge representation schemes such as expert or rule-based systems.
Why demand comprehensibility? A comprehensible system is easy to model and understand, easy to maintain and support, easy to develop and enhance, and easy to transfer to other knowledge representations such as expert rule-based systems.
Comprehensible Neural Network Trees  (CNNTrees)
Comprehensibility and symbolic representation. Comprehensible systems can be modeled and described using symbolic representations. Symbolic formulas are among the oldest and most popular symbolic representations. A symbolic formula decomposes into data symbols and operation symbols; the operation symbols are agreed conventional operators such as AND, OR, and NOT in Boolean logic formulas. Symbolic formulas can be represented as a TREE structure.
Comprehensible Expert Neural Network (CENN). In the XOR example, a simple neural network is trained for each conventional operation: AND, OR, and NOT. Such a simple neural network, corresponding to one of the agreed conventional operations, is called a Comprehensible Expert Neural Network (CENN). The collection of CENNs used for building the tree is called the Comprehensible Expert Neural Network Library (CENNL).
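The idea of a CENN library can be illustrated with a minimal sketch: one two-input perceptron trained per Boolean operator with the classic perceptron rule. The training routine, learning rate, and thresholding below are assumptions for illustration, not the thesis implementation; the thesis only specifies that each CENN is a simple network trained for one conventional operation.

```python
# Illustrative sketch: build a CENN library (CENNL) of tiny perceptrons,
# one per Boolean operator. Hyperparameters are assumed, not from the thesis.

def train_perceptron(samples, epochs=100, lr=0.1):
    """Train weights [w1, w2, bias] on ((x1, x2), target) samples."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + w[2] > 0 else 0
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            w[2] += lr * err
    return w

def predict(w, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + w[2] > 0 else 0

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
truth_tables = {
    "AND":  [0, 0, 0, 1],
    "OR":   [0, 1, 1, 1],
    "NOT1": [1, 1, 0, 0],  # complements the first input
    "NOT2": [1, 0, 1, 0],  # complements the second input
}

# The CENN library: operator name -> trained perceptron weights.
cennl = {op: train_perceptron(list(zip(inputs, outs)))
         for op, outs in truth_tables.items()}
```

All four target functions are linearly separable, so the perceptron rule converges; the trained weights are exactly the kind of interpretable "expert" units the tree is built from.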
Comprehensible Neural Network Tree (CNNTree). The comprehensible neural network tree is analogous to a symbolic formula: the operators are replaced by CENNs, the data symbols are the inputs of the tree leaves, and the formula's output is the output of the tree root.
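Evaluating such a tree can be sketched as a small recursive function. The node representation and the XOR tree below are illustrative assumptions; for clarity the CENNs are approximated here by exact Boolean functions rather than trained perceptrons.

```python
# Sketch of CNNTree evaluation: internal nodes apply an operator (standing
# in for a trained CENN), leaves index into the problem's input vector.

OPS = {
    "AND":  lambda a, b: a & b,
    "OR":   lambda a, b: a | b,
    "NOT1": lambda a, b: 1 - a,  # complements the first input, ignores the second
    "NOT2": lambda a, b: 1 - b,  # complements the second input, ignores the first
}

def evaluate(node, x):
    """node is either ('leaf', input_index) or ('op', name, left, right)."""
    if node[0] == "leaf":
        return x[node[1]]
    _, name, left, right = node
    return OPS[name](evaluate(left, x), evaluate(right, x))

# XOR as a tree of conventional operations: (x0 OR x1) AND NOT(x0 AND x1).
xor_tree = ("op", "AND",
            ("op", "OR", ("leaf", 0), ("leaf", 1)),
            ("op", "NOT1",
             ("op", "AND", ("leaf", 0), ("leaf", 1)),
             ("leaf", 0)))  # second child of NOT1 is ignored
```

The root output of `xor_tree` reproduces the XOR truth table, mirroring how a symbolic formula's output is the tree root's output.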
Designing the CNNTree
Design constraints. In this work the following design is used: a fixed tree structure, specifically a full binary tree.
Design constraints (cont.). In this work the following design is used: a single perceptron with two inputs is the unified CENN structure, and the CENN library contains CENNs for the Boolean logic operators. The operators in the CENNL are AND, OR, NOT1, and NOT2, where NOTi complements the i-th input. Two additional CENN items are introduced, TRANSFER1 and TRANSFER2, which pass the corresponding input directly to the output unchanged.
Design constraints (cont.). [Figure: a CNNTREE with a depth of 3 layers, together with the training-set vectors (x0, x1, x2, d0).]
Designing the CNNTREE. The design problem can be summarized as finding: a CENN from the CENN library for each node of the CNNTREE, and a problem input for each leaf-node input, such that the resulting configuration of nodes and inputs minimizes the error. To solve this problem, two design-by-neural-networks methods were tried (a supervised method and an unsupervised method), and a simple genetic algorithm (SGA) was adopted.
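The objective can be made concrete with a short sketch. The thesis states only that the configuration should minimize the error; the sum-of-squared-errors measure and the inverse-error fitness below are assumptions, chosen to be consistent with the later stopping rule of fitness greater than 10.

```python
# Hedged sketch of the design objective. Exact error and fitness forms
# are assumptions, not taken from the thesis.

def tree_error(tree_predict, training_set):
    """Sum of squared errors of the tree's root output over all training vectors."""
    return sum((tree_predict(x) - d) ** 2 for x, d in training_set)

def fitness(tree_predict, training_set):
    # Higher fitness for lower error; the small constant avoids division by zero.
    return 1.0 / (tree_error(tree_predict, training_set) + 1e-6)
```

Under this assumed form, a tree that classifies the training set perfectly gets a very large fitness, easily crossing a threshold such as 10.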
Design using Genetic Algorithms. The genetic algorithm uses three operators: roulette-wheel selection, one-point crossover, and byte mutation. The genotype of a CNNTREE is the concatenation of the CENN indices into the CENN library and the assigned input indices, with each index represented as an 8-bit binary number.
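The genotype and the three operators can be sketched as follows. The tree depth, library size, input count, and modulo decoding are illustrative assumptions; the thesis specifies only the operator names and the 8-bit index encoding.

```python
# Sketch of the CNNTREE genotype (one byte per index) and the three GA
# operators named above. Sizes below are assumptions for illustration.
import random

NUM_OPS = 6       # AND, OR, NOT1, NOT2, TRANSFER1, TRANSFER2
NUM_INPUTS = 64   # e.g. an 8 x 8 pixel pattern
DEPTH = 3         # full binary tree

n_nodes = 2 ** DEPTH - 1     # internal CENN nodes (7 for depth 3)
n_leaves = 2 ** DEPTH        # leaf inputs (8 for depth 3)
GENOME_LEN = n_nodes + n_leaves

def random_genome():
    return [random.randrange(256) for _ in range(GENOME_LEN)]

def decode(genome):
    """Map raw bytes to valid CENN indices and input indices (assumed modulo rule)."""
    ops = [g % NUM_OPS for g in genome[:n_nodes]]
    leaves = [g % NUM_INPUTS for g in genome[n_nodes:]]
    return ops, leaves

def roulette_select(population, fitness):
    """Roulette-wheel selection: probability proportional to fitness."""
    r = random.uniform(0, sum(fitness))
    acc = 0.0
    for genome, f in zip(population, fitness):
        acc += f
        if acc >= r:
            return genome
    return population[-1]

def one_point_crossover(a, b):
    point = random.randrange(1, len(a))
    return a[:point] + b[point:], b[:point] + a[point:]

def byte_mutate(genome, p=0.05):
    """Replace each byte with a fresh random byte with probability p."""
    return [random.randrange(256) if random.random() < p else g
            for g in genome]
```

With depth 3, the genome is 15 bytes: 7 operator indices followed by 8 input indices, matching the concatenation described above.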
Experiments & Results
Digit recognition. An experiment on recognizing numerical character patterns is illustrated here. Ten numerical character patterns [0-9] are used; each pattern consists of 64 (8 x 8) black-and-white pixels.
Genetic Algorithm Parameters. The mutation probability is 0.05, the crossover probability is 0.8, and the population size is 20 chromosomes. The genetic algorithm stops when it finds a tree with a fitness value greater than 10.
Correctness experiment. To measure the correctness of an evolved CNNTREE, random noisy pixels are generated in the patterns: for every pattern, between 1 and 9 random pixels are selected and their values inverted. For each number of noisy pixels the experiment is repeated 100 times, and the number of correctly classified patterns is counted. The experiment is performed for CNNTREEs with depths from 7 to 9 layers. The same experiments are conducted with a single-hidden-layer feed-forward neural network, to provide a baseline for judging the results.
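The noise-injection procedure can be sketched directly from this description. The classifier and pattern shapes below are placeholders; only the "invert k distinct random pixels, repeat 100 times, count correct classifications" logic follows the slide.

```python
# Sketch of the correctness experiment: flip k random pixels and measure
# the fraction of trials the classifier still gets right.
import random

def add_noise(pattern, k):
    """Return a copy of a binary pattern with k distinct random pixels inverted."""
    noisy = list(pattern)
    for i in random.sample(range(len(noisy)), k):
        noisy[i] = 1 - noisy[i]
    return noisy

def correctness_ratio(classifier, patterns, labels, k, trials=100):
    """Fraction of trials in which a k-pixel-noisy pattern is classified correctly."""
    correct = 0
    for _ in range(trials):
        idx = random.randrange(len(patterns))
        if classifier(add_noise(patterns[idx], k)) == labels[idx]:
            correct += 1
    return correct / trials
```

In the experiment this loop would run for k from 1 to 9, once per evolved CNNTREE and once per feed-forward baseline, producing the correctness-ratio curves shown in the next slides.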
Correctness  results using CNNTREE Correctness ratio of CNNTREEs at error threshold 0.1, 0.2, 0.3, and 0.4
Correctness results using FF ANN Correctness ratio of feed forward neural network at different threshold values
Correctness results: feed-forward neural network vs. CNNTREE.
Performance experiments. The number of generations and the per-generation times are measured while designing a CNNTREE with the proposed genetic algorithm, and the average overall training time is plotted against tree depth. A second experiment measures back-propagation time: the learning time of the patterns is measured, and the average time to learn one pattern is calculated and plotted. Both experiments are repeated 10 times for every tree depth from 6 to 10 layers.
Performance results for designing the CNNTREE. Average total genetic training time versus CNNTREE depth, measured on an Intel Pentium 1.70 GHz processor.
Performance Results for on-line LOCAL training Average back-propagation training epoch time versus network tree depth
Designing CNNTREEs using ANNs. Supervised method: try to infer the appropriate configuration for a given neural network from previously known configurations for other problems. Unsupervised method: use Train-Classify-Modify iterations until a suitable configuration is found.
Supervised method
Unsupervised method
Unsupervised method. Training uses back-propagation. Classification is based on the Euclidean distance of the weights from the primitives' weights and of the outputs from the primitives' outputs. Modification selects the N classified primitives that are farthest away and randomizes them; this randomization is the weak point of the approach.
CONCLUSION. A new type of modular neural network, the comprehensible neural network tree (CNNTREE), is introduced. Experimental results on a digit recognition problem show that CNNTREE generalization is comparable to that of single-hidden-layer feed-forward neural networks, while the CNNTREE is constructed so that it can be easily interpreted by symbolic systems.
Thank you for your attention.
