Evolving Neural Networks
through
Augmenting Topologies
Authors
Kenneth O. Stanley
Risto Miikkulainen
Presenters
Navin Adhikari
Aavaas Gajurel
2
Overview
 Introduction
 Background
 NeuroEvolution of Augmenting Topologies (NEAT)
 Evaluating NEAT
 Analysis
 Discussion and Applications
3
Introduction
 Artificial evolution of neural networks using genetic algorithms
 Evolving what?
 Evolving weights in fixed topology
• The weight space is explored through the crossover of network weight vectors and
through the mutation of single network weights
• The goal of fixed-topology NE is to optimize the connection weights that
determine the functionality of a network
 Topology and Weight Evolving Artificial Neural Networks (TWEANNs)
• Not only the connection weights but also the topology of a NN affects its
functionality
 Evolving structure incrementally presents several technical challenges
 How can disparate topologies be represented so that they can cross over in a meaningful way?
 How can topological innovation that needs a few generations to be optimized be
protected so that it does not disappear from the population prematurely?
 How can topologies be minimized throughout evolution without the need for a specially
contrived fitness function that measures complexity?
NeuroEvolution
4
Background
 How to encode a neural network?
 Direct
 Genome explicitly specifies the phenotype
 Many encoding techniques have been proposed, from binary encoding to graph encoding
 Binary Encoding
 A bit string represents the connection matrix of the network
 Advantages: Evolution can be performed with a standard GA
 Drawbacks: representation size grows as the square of the number of nodes
 Graph Encoding
 Nodes are laid out in a two-dimensional grid that represents the
graph form of the genome
 Indirect
 Genome specifies how to build a network
 Allows a more compact representation than direct encoding, because not every
connection and node is specified explicitly in the genome, although they can be
derived from it
Issues with TWEANNs
5
Background
 Mating (Crossover)
 Competing Conventions
 The competing conventions problem arises when there is more than one way of
representing the same solution in a genome
 Two neural networks can encode the same function with hidden nodes in a
different order, e.g., [A,B,C] and [C,B,A]; crossing them over can produce
[A,B,A], losing C entirely
 No matter how such networks are crossed, their offspring are likely to lose
information
 Can be solved using historical markings
Issues with TWEANNs
6
Background
 Protecting Innovations
 Structural innovation usually produces larger networks with more connections
 Adding a new connection can reduce fitness before its weight has a chance to
optimize
 Some generations are required to optimize the new structure and make use of it
 Larger networks require more time to be optimized and cannot compete with
smaller ones
 Due to initial loss of fitness caused by the new structure, the innovation is unlikely
to survive in the population long enough to be optimized
 Speciation is an effective technique for restricting competition and mating to
networks that are not too different
Issues with TWEANNs
7
Background
 Initial Populations and Topological Innovation
 TWEANNs typically start with initial populations of random networks to ensure
topological diversity from the start
 Drawbacks
 under many of the direct encoding schemes, there is a chance that a
network will have no path from its inputs to its outputs; such networks
take time to weed out of the population
 Starting out with random topologies does not lead to finding desirable
minimal solutions since the population starts out with many unnecessary
nodes and connections already present
 Solutions
 Fitness penalty: difficult to know how large the penalty should be for
any particular network size, particularly because different problems may
have significantly different topological requirements
 starting out with a minimal population, i.e. with no hidden nodes, and
growing structure only as it benefits the solution
Issues with TWEANNs
8
NEAT
 NeuroEvolution of Augmenting Topologies effectively solves the
TWEANN issues through:
 Tracking Genes through Historical Markings
 Protecting Innovation through Speciation
 Minimizing Dimensionality through Incremental Growth from Minimal Structure
9
NEAT
Genetic Encoding
10
NEAT
 Mutation in NEAT can change both connection weights and
network structures
Mutation
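As a rough sketch of how these two structural mutations could be implemented (the dict-based genome and helper names here are hypothetical, not the paper's code; the add-node details, i.e. disabling the old gene and using weight 1.0 in / the old weight out, follow the paper):

```python
import random

INNOVATION = 0  # global historical-marking counter (explained on the next slides)

def next_innovation():
    """Return a fresh innovation number for a newly created gene."""
    global INNOVATION
    INNOVATION += 1
    return INNOVATION

def mutate_add_connection(genome, nodes):
    """Structural mutation 1: add a new connection gene between two nodes."""
    src, dst = random.sample(nodes, 2)
    genome[next_innovation()] = {"in": src, "out": dst,
                                 "weight": random.uniform(-1.0, 1.0),
                                 "enabled": True}

def mutate_add_node(genome, nodes):
    """Structural mutation 2: split an existing connection in two.
    The old gene is disabled; the new node gets weight 1.0 on its
    incoming connection and the old weight on its outgoing one."""
    gene = random.choice(list(genome.values()))
    gene["enabled"] = False
    new_node = max(nodes) + 1
    nodes.append(new_node)
    genome[next_innovation()] = {"in": gene["in"], "out": new_node,
                                 "weight": 1.0, "enabled": True}
    genome[next_innovation()] = {"in": new_node, "out": gene["out"],
                                 "weight": gene["weight"], "enabled": True}
```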
11
NEAT
Genetic Encoding
12
NEAT
 Two genes with the same history are expected to be homologous
 All a system needs to do to know which genes line up with which is to
keep track of the historical origin of every gene in the system
 Keep a global counter; every time a neuron or synapse is added, assign it
the current value of the counter, and increment the counter
 When crossing over, the genes in both genomes with the same innovation
numbers are lined up
Historical Marking
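A minimal sketch of this bookkeeping (the dict-keyed-by-innovation-number representation is an assumption, not prescribed by the paper): genes that share a number are found with a simple set intersection, with no topological analysis needed.

```python
def matching_genes(genome_a, genome_b):
    """Genomes are dicts keyed by innovation number. Genes that share a
    number have the same historical origin, so they line up directly."""
    shared = genome_a.keys() & genome_b.keys()
    return [(genome_a[i], genome_b[i]) for i in sorted(shared)]
```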
13
NEAT
Historical Marking
Take two parent NNs:
• Line up connection genes according
to their historical markings
• For matching genes, copy either gene
into the child at random
• Disjoint genes (those in the middle
without a partner gene) and excess genes
(those at the end) are inherited from the more fit parent
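A sketch of this crossover under the same assumed dict-of-genes representation (matching genes copied from either parent at random; disjoint and excess genes taken from the more fit parent, as in the paper):

```python
import random

def crossover(fit_parent, other_parent):
    """Produce a child genome from two parents aligned by innovation
    number. fit_parent is assumed to be the more fit of the two."""
    child = {}
    for innov, gene in fit_parent.items():
        if innov in other_parent:   # matching gene: copy either parent's version
            child[innov] = dict(random.choice((gene, other_parent[innov])))
        else:                       # disjoint/excess: inherit from fitter parent
            child[innov] = dict(gene)
    return child
```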
14
NEAT
Protecting Innovation through Speciation
For useful innovations to survive, they have to be
protected
NEAT solves this problem with speciation:
• Similar networks are clustered into species
• Competition and mating are restricted to the same species
• Fitness sharing prevents a single species from taking over
the population
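Per the paper, the sharing scheme is explicit fitness sharing: every organism's fitness is divided by the number of organisms in its species,

```latex
f'_i = \frac{f_i}{\sum_{j=1}^{n} \operatorname{sh}\!\big(\delta(i,j)\big)}
```

where sh(δ(i, j)) is 0 when the distance δ exceeds the compatibility threshold δt and 1 otherwise, so the denominator reduces to the size of organism i's species.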
15
NEAT
Protecting Innovation through Speciation
A compatibility distance is computed on the basis of the number of
excess (E) and disjoint (D) genes and the average weight difference W of
matching genes
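The paper computes this distance as

```latex
\delta = \frac{c_1 E}{N} + \frac{c_2 D}{N} + c_3 \cdot \overline{W}
```

where N is the number of genes in the larger genome (normalizing for genome size) and c1, c2, c3 are the coefficients set on the parameter slides below; a genome is placed into the first species whose representative is within the threshold δt.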
16
NEAT
Minimizing Dimensionality through Incremental
Growth from Minimal Structure
NEAT biases the search towards minimal-dimensional spaces
by starting out with a uniform population of networks with
zero hidden nodes (i.e., all inputs connect directly to outputs)
New structure is introduced incrementally as structural
mutations occur
Minimizing dimensionality gives NEAT a performance
advantage compared to other approaches, as will be discussed
in the next section
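A sketch of such a minimal starting genome (same assumed representation as the earlier sketches; next_innovation is the hypothetical counter helper from the mutation sketch):

```python
import random

def minimal_genome(num_inputs, num_outputs, next_innovation):
    """Build a NEAT starting genome: every input (plus a bias node)
    connects directly to every output, with no hidden nodes, so the
    search begins in the smallest weight space possible."""
    inputs = list(range(num_inputs + 1))                     # +1 bias node
    outputs = list(range(len(inputs), len(inputs) + num_outputs))
    genome = {}
    for src in inputs:
        for dst in outputs:
            genome[next_innovation()] = {"in": src, "out": dst,
                                         "weight": random.uniform(-1.0, 1.0),
                                         "enabled": True}
    return genome
```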
 Presenter: Aavaas
17
Evaluating NEAT Performance
 1) Can NEAT evolve the necessary structures?
 2) Can it find solutions more efficiently?
 For 1), build a XOR network
 For 2), build a network for the pole-balancing task
With velocities: DPV
Without velocities: DPNV (difficult)
18
Parameter Settings NEAT/GA
 Some parameters are sensitive to population size
 All experiments
 Population = 120
 c1 = 1.0, c2 = 1.0, c3 = 0.4
 δt = 3.0 (compatibility threshold)
 For DPNV
 Population = 1000
 c3 = 3.0 (allow for finer distinctions between species)
 δt = 4.0 (make room for the larger c3)
19
Parameter Settings NEAT/GA
 Pmutation(weight) = 80%
 90% small perturbation
 10% new random value
 Pmutation (new node) = 0.03
 Pmutation (new link) = 0.05 (0.3 for DPNV)
 High mutation is possible due to speciation
 25% of offspring created with no crossover
 Discard Species whose max fitness didn’t increase for 15 generations
 Champion of each species with more than 5 networks was copied to the next generation unchanged
 Interspecies mating rate = 0.001
 If a connection gene is disabled in either parent -> 75% chance it is disabled in the child
 All parameter values were found experimentally
 Links need to be added more often than nodes
20
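Gathered into a single configuration sketch (the key names are hypothetical; the values come from the two slides above):

```python
NEAT_PARAMS = {
    "population_size": 120,          # 1000 for DPNV
    "c1": 1.0, "c2": 1.0, "c3": 0.4, # c3 = 3.0 for DPNV
    "delta_t": 3.0,                  # compatibility threshold; 4.0 for DPNV
    "p_mutate_weights": 0.80,        # 90% small perturbation, 10% new random value
    "p_add_node": 0.03,
    "p_add_link": 0.05,              # 0.3 for DPNV
    "p_no_crossover": 0.25,          # offspring produced by mutation alone
    "p_interspecies_mating": 0.001,
    "p_inherit_disabled": 0.75,      # disabled in child if disabled in a parent
    "species_stagnation": 15,        # generations before a species is discarded
    "champion_min_species_size": 5,  # larger species copy their champion unchanged
}
```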
Evaluation: Evolving XOR
 Simple test to verify NEAT's ability
 XOR is not linearly separable
 Needs a hidden layer
 Fitness = (4 − E)^2, where E is the sum of |target − output| over the four input patterns
 At start: no hidden units; 2 inputs, 1 bias, 1 output
 Random connection weights
 Found solutions in all of 100 runs
21
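A sketch of this fitness computation (the network object and its activate method are hypothetical stand-ins for whatever evaluates the evolved network):

```python
XOR_CASES = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def xor_fitness(network):
    """Fitness = (4 - total output error)^2: a perfect network scores 16,
    and remaining error is penalized quadratically."""
    error = sum(abs(target - network.activate(inputs))
                for inputs, target in XOR_CASES)
    return (4.0 - error) ** 2
```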
Evaluation: Evolving XOR
 Optimal solution was found in 22 of 100 runs
 Avg generation to solution = 32
 Avg hidden nodes = 2.35
 Avg non-disabled connections = 7.48
 NEAT solves XOR without trouble and keeps the
topology small while doing so.
22
Pole Balancing
 https://www.youtube.com/watch?v=ERYlXxjGb6E
 A known benchmark and a real control task
 Apply force to the cart to keep the
poles balanced for as long as possible
 State: with velocities (DPV) or without velocities (DPNV, hard)
 Cart position, cart velocity,
(pole angle, angular velocity) × 2 poles
 Simulated with a time step of 0.01 seconds
 all state variables scaled to [-1,1] before feeding to network
 Network output every 0.02 seconds.
 Long pole (1.0 m) starts at a 1° angle; short pole (0.1 m) starts straight up
 Neuroevolutionary approaches have outperformed RL in prior comparisons.
23
Double Pole Balancing with velocity
 Success if both poles stay balanced for 100,000 time steps
 A pole counts as balanced if it is between -36 and 36 degrees from vertical
 Fitness = number of time steps both poles remained balanced
 Table: NEAT averaged over 120 runs, others over 50 runs
 SANE: fixed topologies, evolving populations of neurons and network blueprints
 ESP = SANE + separate neuron populations for each hidden-unit position
 ESP was the reigning champion
 NEAT takes fewest evaluations to complete the task
 NEAT solutions used 0–4 hidden nodes vs. ~10 for ESP
24
Double Pole Balancing without Velocities (hard)
 A special fitness function was used to prevent the system from
solving the problem by simply wiggling the cart rapidly.
 This forces the system to internally compute the hidden state,
i.e. the velocities.
 In addition to balancing poles for 100,000 steps,
 we also measure its generalization on 625 different initial
states for 1000 time steps.
 Champion is selected as a representative.
 To count as a solution, it must generalize to at least 200
out of the 625 states.
 5^4 = 625 ( the permutation of 0.05, 0.25,0.5, 0.75 on 4 initial
state variable configurations)
25
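A sketch of how those 625 start states can be enumerated (scaling each fraction into the actual variable ranges is left out):

```python
from itertools import product

FRACTIONS = (0.05, 0.25, 0.5, 0.75, 0.95)

# Each of the 4 varied state variables is set to one of 5 fractions of
# its allowed range, giving 5**4 = 625 distinct initial states.
initial_states = list(product(FRACTIONS, repeat=4))
assert len(initial_states) == 625
```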
Comparison DPNV
 NEAT is the fastest, about 5 times faster than ESP
 ESP needed to reinitialize an average of 4.06 times per run due to getting
stuck
 This characterizes NEAT as more reliable at avoiding
deception, since it keeps trying different structural approaches.
 Now, we have established that NEAT is a robust
technique
26
Importance of each component of NEAT
 We have argued that NEAT's performance is due to
 Historical markings,
 Speciation,
 Growth from minimal structure
 Do we need all three?
 E.g., it is possible that historical markings and
speciation alone are sufficient for performance.
 We perform a series of ablations (surgical removals) of
each component to gauge their contributions.
27
Ablations
 Used DPV (with velocities) as it is simpler, so even a crippled
NEAT can be expected to find solutions.
 We cannot remove historical markings, as that would reduce NEAT to conventional
NE
 The markings are also the basis for every other function of NEAT (speciation,
crossover)
 Averaged over 20 runs (except 120 runs for non-mating, which was fast to evaluate)
 System performs significantly worse for every ablation.
28
Ablation
 No-Growth:
 Would mean NEAT cannot generate hidden nodes
 Thus it was allowed to start with the same fully connected topology of 10 hidden nodes
 NEAT was still able to speciate, but only based on weight differences.
 Initial Random Population:
 Each individual is initialized with a random genome
 Was 7 times slower
 Suggesting it had to search higher-dimensional spaces than required, wasting time
 Nonspeciated:
 No structural innovations can survive, so networks get stuck in minimal form
 7 times slower: without speciation, the population quickly converges on whatever
topology happens to initially perform best.
 NonMating:
 Took on average more evaluations to
find a solution than with mating on.
 Shows that crossover is useful.
29
Ablation Conclusion
 All parts of NEAT work together to
produce its performance.
30
Visualizing speciation over time
31
Most of the species that did not come close to a solution still survived to the end.
The winning species was 11 generations old and did not take over the population.
Insights
 NEAT shows a good capacity to find new structures and is efficient on
difficult control tasks
 If the amount of structure can be minimized throughout evolution, the search
space to be explored is greatly reduced, leading to significant gains.
 A parallel is drawn with incremental evolution, where a system is trained on
simpler tasks first and then uses that expertise to do better while
training on harder tasks.
 Solving the easier version of the task places the system in a region of the
search space closer to the solution of the harder task.
 Adding structure is analogous to building upon the simpler task.
32
Insights
 NEAT is not necessarily trapped even if the current network
represents a local optimum, as it can always add more structure.
 NEAT strengthens the analogy between GAs and natural
evolution, as it not only allows optimization of the solution,
 it also performs complexification, allowing solutions
to become incrementally more complex.
33
Hypothesis: Use in competitive
coevolution
 The authors claim that once coevolution converges onto a
dominant strategy, the entire set of weight values is committed to
representing that strategy.
 If a new strategy is to take hold, it must be different, rather
than a more sophisticated elaboration of the same representation.
 Thus NEAT can allow for continual competitive
coevolution, as adding new structure opens new
expressive space for elaboration.
 Different strategies are protected, thus there will be multiple
dominant strategies.
34
Hypothesis: Using NEAT to
integrate networks
 Two networks with different specialties,
 e.g., good at dribbling vs. kicking in soccer.
 To combine them optimally, the hidden
nodes of the two networks must share information so
that the skills are combined effectively.
 NEAT can search for the right interconnections between
the two distinct networks,
 creating an integrated supernetwork that takes
advantage of the expertise of both components.
35
Fin
36