NEAT (NeuroEvolution of Augmenting Topologies) is a popular neuroevolution algorithm, one of the few such algorithms that evolves the architectures of its networks in addition to the weights. For more information, see this research paper: http://nn.cs.utexas.edu/downloads/papers/stanley.ec02.pdf.
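As a quick illustration of the underlying NEAT loop, here is a minimal sketch using NEAT-Python (the library this project builds on); the XOR fitness task and the `neat.cfg` file name are placeholders, not part of this repository:

```python
import neat

def eval_genomes(genomes, config):
    # Assign a fitness to every genome; a toy XOR task stands in here.
    xor_inputs = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
    xor_outputs = [0.0, 1.0, 1.0, 0.0]
    for genome_id, genome in genomes:
        net = neat.nn.FeedForwardNetwork.create(genome, config)
        error = sum((net.activate(i)[0] - o) ** 2
                    for i, o in zip(xor_inputs, xor_outputs))
        genome.fitness = 4.0 - error

config = neat.Config(
    neat.DefaultGenome, neat.DefaultReproduction,
    neat.DefaultSpeciesSet, neat.DefaultStagnation,
    "neat.cfg",  # placeholder config file defining inputs, outputs, mutation rates, etc.
)
population = neat.Population(config)
winner = population.run(eval_genomes, 50)  # evolve for up to 50 generations
```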

HyperNEAT is an extension to NEAT that indirectly encodes the weights of the network (called the substrate) with a separate network (called a CPPN, for compositional pattern-producing network). For more information on HyperNEAT, see this website: http://eplex.cs.ucf.edu/hyperNEATpage/.
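The indirect encoding can be sketched roughly as follows: the CPPN is queried with the coordinates of each pair of substrate nodes, and its output becomes that connection's weight. The coordinate layout and the stand-in `cppn` function below are illustrative assumptions, not this repository's API:

```python
import torch

def build_substrate_weights(cppn, input_coords, output_coords):
    """Query a CPPN at every (input, output) coordinate pair to fill a weight matrix.

    `cppn` is assumed to map (x1, y1, x2, y2) -> a scalar weight; the coordinate
    lists define the substrate's geometry.
    """
    weights = torch.zeros(len(output_coords), len(input_coords))
    for i, (x2, y2) in enumerate(output_coords):
        for j, (x1, y1) in enumerate(input_coords):
            weights[i, j] = cppn(x1, y1, x2, y2)
    return weights

# Example substrate: 3 inputs on a line at y=-1, 2 outputs at y=1.
input_coords = [(-1.0, -1.0), (0.0, -1.0), (1.0, -1.0)]
output_coords = [(-0.5, 1.0), (0.5, 1.0)]

# Stand-in CPPN; in HyperNEAT this would itself be a small evolved network.
toy_cppn = lambda x1, y1, x2, y2: torch.tanh(torch.tensor(x1 * x2 + y1 * y2))

W = build_substrate_weights(toy_cppn, input_coords, output_coords)  # shape (2, 3)
```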

Adaptive HyperNEAT is an extension to HyperNEAT which indirectly encodes both the initial weights and an update rule for the weights such that some learning can occur during a network's "lifetime." For more information, see this research paper: http://eplex.cs.ucf.edu/papers/risi_sab10.pdf.
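A common formulation has the CPPN output, alongside an initial weight, the coefficients of a local Hebbian-style rule that is applied to each connection while the substrate runs. The function below is an illustrative sketch of such an update, not this repository's implementation; the coefficient names are assumptions:

```python
import torch

def hebbian_update(weights, pre, post, eta, A, B, C, D):
    """One plasticity step for every connection (a sketch of an ABCD-style Hebbian rule).

    `pre` and `post` are the activations of the source and target neurons;
    `eta`, `A`, `B`, `C`, `D` are per-connection coefficients that, in Adaptive
    HyperNEAT, would themselves come from the CPPN.
    """
    delta = eta * (
        A * torch.outer(post, pre)   # correlation term
        + B * pre.unsqueeze(0)       # presynaptic term
        + C * post.unsqueeze(1)      # postsynaptic term
        + D                          # constant drift
    )
    return weights + delta

# Usage sketch: 3 presynaptic neurons feeding 2 postsynaptic neurons.
w = torch.zeros(2, 3)
pre = torch.tensor([0.1, 0.5, 0.9])
post = torch.tensor([0.3, 0.7])
w = hebbian_update(w, pre, post, eta=0.1, A=1.0, B=0.0, C=0.0, D=0.0)
```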

PyTorch NEAT builds upon NEAT-Python by providing some functions which can turn a NEAT-Python genome into either a recurrent PyTorch network or a PyTorch CPPN for use in HyperNEAT or Adaptive HyperNEAT. We also provide some environments in which to test NEAT and Adaptive HyperNEAT, and a more involved example using the CPPN infrastructure with Adaptive HyperNEAT on a T-maze.
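For example, a genome evolved with NEAT-Python might be evaluated roughly as follows. The module path and the `create`/`activate` signatures are written from memory and may differ from the actual API; see the examples in this repository for the canonical usage:

```python
import neat
import torch

# Assumed import path for this repository's recurrent-network wrapper.
from pytorch_neat.recurrent_net import RecurrentNet

config = neat.Config(
    neat.DefaultGenome, neat.DefaultReproduction,
    neat.DefaultSpeciesSet, neat.DefaultStagnation,
    "neat.cfg",  # placeholder NEAT-Python config file
)

def eval_genome(genome, config):
    # Convert the NEAT-Python genome into a recurrent PyTorch network.
    net = RecurrentNet.create(genome, config, 1)  # assumed batch-size argument
    obs = torch.zeros(1, config.genome_config.num_inputs)
    output = net.activate(obs)  # one forward step through the recurrent net
    return float(output.sum())  # placeholder fitness
```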
