NEAT (NeuroEvolution of Augmenting Topologies) is an evolutionary algorithm that evolves both the structure and the connection weights of neural networks. It addresses the main difficulties of evolving network topologies by using historical markings (innovation numbers) to track genes so crossover can align homologous structures, speciation to protect new structural innovations while they are optimized, and incremental growth from minimal initial structures. Evaluations show NEAT can efficiently evolve solutions to problems such as the XOR task and pole balancing. Ablation studies indicate that all of NEAT's components contribute to its performance. The paper hypothesizes that NEAT could be useful for competitive coevolution, since complexification allows multiple strategies to emerge, and for integrating separate networks by evolving connections between them.
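The historical markings and the speciation mechanism can be made concrete with a small sketch. The Python snippet below is illustrative only: the ConnectionGene class and the compatibility_distance helper are hypothetical names, and the coefficients c1, c2, c3 follow the compatibility-distance formula described in the paper (excess genes, disjoint genes, and average weight difference of matching genes, normalized by the size of the larger genome). It is not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class ConnectionGene:
    """One connection in a NEAT genome, tagged with a historical marking."""
    in_node: int
    out_node: int
    weight: float
    enabled: bool
    innovation: int  # global innovation number assigned when the gene first appears


def compatibility_distance(genome_a, genome_b, c1=1.0, c2=1.0, c3=0.4):
    """Compatibility distance delta = c1*E/N + c2*D/N + c3*W_bar (hypothetical helper).

    E = excess genes, D = disjoint genes, W_bar = mean weight difference of
    matching genes, N = number of genes in the larger genome. Genomes are
    given as lists of ConnectionGene.
    """
    genes_a = {g.innovation: g for g in genome_a}
    genes_b = {g.innovation: g for g in genome_b}
    max_a, max_b = max(genes_a), max(genes_b)

    matching, weight_diff, disjoint, excess = 0, 0.0, 0, 0
    for innov in set(genes_a) | set(genes_b):
        if innov in genes_a and innov in genes_b:
            matching += 1
            weight_diff += abs(genes_a[innov].weight - genes_b[innov].weight)
        elif innov > min(max_a, max_b):
            excess += 1    # gene lies beyond the other genome's innovation range
        else:
            disjoint += 1  # gene lies inside the range but has no counterpart

    n = max(len(genome_a), len(genome_b))
    w_bar = weight_diff / matching if matching else 0.0
    return c1 * excess / n + c2 * disjoint / n + c3 * w_bar
```

In a speciation step of this kind, each genome would be compared against a representative of each existing species and placed in the first species whose compatibility distance falls below a chosen threshold, so that new topologies compete primarily within their own niche rather than with the whole population.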