Welcome to NEAT-Python’s documentation!
Todo
This is a draft version of the documentation, covering Dr. Allen Smith’s multiparam_funcs branch, and is likely to change rapidly. Please see neat-python.readthedocs.io for the official documentation of the NEAT-Python master branch.
NEAT is a method developed by Kenneth O. Stanley for evolving arbitrary neural networks. NEAT-Python is a pure Python implementation of NEAT, with no dependencies other than the Python standard library.
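For orientation, the sketch below shows the general shape of a NEAT-Python run, loosely modeled on the xor2.py example discussed later in these pages. The fitness function and the 'config-feedforward' file name are placeholders for this illustration, and details may differ on this branch; see the configuration file description and XOR example sections for the authoritative versions.

import neat

def eval_genomes(genomes, config):
    # Placeholder fitness function: assign a fitness to every genome.
    # Replace the toy input and scoring with something appropriate to your problem.
    for genome_id, genome in genomes:
        net = neat.nn.FeedForwardNetwork.create(genome, config)
        output = net.activate((0.0, 1.0))          # example input vector
        genome.fitness = 1.0 - abs(1.0 - output[0])  # toy fitness measure

# 'config-feedforward' is an assumed configuration file path; its contents
# are described in the configuration file section of this documentation.
config = neat.Config(neat.DefaultGenome, neat.DefaultReproduction,
                     neat.DefaultSpeciesSet, neat.DefaultStagnation,
                     'config-feedforward')

pop = neat.Population(config)
pop.add_reporter(neat.StdOutReporter(True))  # print per-generation statistics

winner = pop.run(eval_genomes, 50)  # evolve for up to 50 generations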
Note
Some of the example code has additional dependencies; please see each example’s README.md file for details and installation/setup instructions. In addition, visualization of results (via the examples’ visualize.py modules) frequently requires graphviz and/or matplotlib. TODO: Improve the README.md information for the examples.
Support for HyperNEAT and other extensions to NEAT is planned once the fundamental NEAT implementation is more complete and stable.
For further information regarding general concepts and theory, please see Selected Publications on Stanley’s website, or his recent AMA on Reddit.
If you encounter any confusing or incorrect information in this documentation, please open an issue in the GitHub project.
Contents:
- NEAT Overview
- Installation
- Configuration file description
- Overview of the basic XOR example (xor2.py)
- Customizing Behavior
- Overview of builtin activation functions
- General-use activation functions (single-parameter)
- General-use activation functions (multiparameter)
- CPPN-intended activation functions (single-parameter)
- CPPN-intended activation functions (multiparameter)
- Continuous-time recurrent neural network implementation
- Module summaries
- Genome Interface
- Reproduction Interface
- Glossary