The NeuroEvolution of Augmenting Topologies (NEAT) Users Page

Now there is a HyperNEAT Users Page too!

Last Updated 9/17/13 (list of updates)

I created this page because of growing interest in the use and implementation of the NEAT method. I have been corresponding with an expanding group of users. Because the same points come up more than once, it makes sense to have a place where people can come and tap into the expanding knowledge we have about the software and the method itself.

I continue to support the NEAT community from my new position as an assistant professor at the University of Central Florida in the School of Electrical Engineering and Computer Science, where my research continues as director of the Evolutionary Complexity Research Group (EPlex).

We also developed an extension to NEAT called HyperNEAT that can evolve neural networks with millions of connections and exploit geometric regularities in the task domain. The HyperNEAT Page includes links to publications and a general explanation of the approach.

New! So that the record of how I thought of NEAT is not lost, I have made available copies of my original notes from January 2000, where the idea first took shape. It turns out that I unintentionally documented my own thinking process, because I was scribbling notes as I worked through the idea. These are those notes, with some commentary explaining what I was thinking at the time.

Tutorial Available: Wesley Tansey has provided a helpful tutorial on setting up a Tic-Tac-Toe experiment in SharpNEAT 2. It should be instructive for anyone aiming to create an experiment in SharpNEAT.

-Kenneth Stanley

Want to see NEAT used in a video game? Check out Galactic Arms Race or NERO.

Contents

Introduction - Which version of NEAT should you use?

NEAT Users Group - Join the group to discuss your projects

Book available with step by step chapter on NEAT

NEAT Software FAQ - Questions that mostly relate to coding issues or using the actual software.

General NEAT Methodology FAQ - More broad questions regarding general methodology or philosophy behind NEAT.

NEAT Users and Projects

The Future of NEAT

NEAT Publications

Introduction

This page is intended for NEAT users, particularly those using one of the available versions of NEAT or writing a version of their own. Over the past few years, several versions of NEAT have become available for different platforms and languages. The remainder of this introduction introduces the available software and attempts to help you choose the right package for your needs. The question for many people first coming to NEAT is: which package is right for me? There are several factors to consider:

First, how closely does the package you want follow my (Ken's) original NEAT source code? For most people this won't be a big issue, since all the packages work. However, if you want to strictly reproduce my experimental results, you may want the most faithful code available. For this purpose, either my original version or Ugo Vierucci's JNEAT are probably the best choices, since JNEAT was written directly from my original C++ source code.

Second, what is your favorite platform? If you prefer Linux, my C++ version is appropriate. If you prefer Windows, Mat Buckland's C++ version would be better. Since Java and Matlab run on all platforms, if you want to use Java or Matlab, platform is not a consideration. SharpNEAT, written in C#, can also run on either platform.

Third, what language do you prefer? Since NEAT is available in C++, Java, Matlab, Delphi, and C#, there are now several choices.

Fourth, what experiments would you like built in? JNEAT, Matlab NEAT, SharpNEAT, ANJI, and the original NEAT C++ all come with XOR. The original NEAT C++ and SharpNEAT also come with pole balancing. Windows NEAT C++ comes with a nice graphical minesweeper demo. Delphi NEAT has some entertaining 3D robot control experiments. SharpNEAT includes an incremental predator/prey experiment. ANJI comes with Tic Tac Toe. If you are planning a new experiment, it may be helpful to look at the code for similar experiments.
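Since XOR is the common sanity-check experiment shipped with most of these packages, it may help to see what an XOR fitness evaluation typically looks like. Below is a minimal sketch in Python; the `activate` callable is a hypothetical stand-in for whichever network-evaluation API your chosen package provides, and the squared-fitness convention is the one commonly used in NEAT XOR experiments (not mandated by any particular release):

```python
# Minimal sketch of an XOR fitness evaluation. The `activate` argument is
# a hypothetical stand-in for a package's network-evaluation call: it
# takes a pair of inputs and returns a single output in [0, 1].

XOR_CASES = [
    ((0.0, 0.0), 0.0),
    ((0.0, 1.0), 1.0),
    ((1.0, 0.0), 1.0),
    ((1.0, 1.0), 0.0),
]

def xor_fitness(activate):
    """Return a fitness in [0, 16]; 16 means all four cases are exact.

    Squaring (4 - total_error) rewards small improvements more strongly
    near the solution, a common convention in NEAT XOR experiments.
    """
    error = 0.0
    for inputs, expected in XOR_CASES:
        output = activate(inputs)  # network output for this input pair
        error += abs(output - expected)
    return (4.0 - error) ** 2
```

A network that answers all four cases exactly scores 16; a network that always outputs 0.5 scores 4.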

Your best option will be based on some combination of the above considerations. Of course, if you want NEAT for platform X or language Y and they aren't available, you may want to write your own version of NEAT. I am happy to hear about such projects so you should let me know what you're thinking of. I can give you advice or point you to any similar projects you may not be aware of.
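If you do write your own version, the central piece of bookkeeping NEAT requires is the historical marking: a global innovation number assigned to each structurally new connection gene, so that genomes can later be lined up for crossover and compatibility measurement. The following is a minimal sketch of that mechanism in Python; the names (`innovation_for`, `ConnectionGene`, `matching_genes`) are illustrative and not taken from any particular NEAT release:

```python
import itertools
from dataclasses import dataclass

# Global innovation counter: every structurally new connection across the
# whole population gets the next number. When the same connection between
# two nodes is discovered again, the original number is reused so that
# identical structural mutations align during crossover.

_counter = itertools.count(1)
_seen: dict = {}

def innovation_for(in_node: int, out_node: int) -> int:
    """Return the historical marking for a connection, reusing the number
    if this exact connection has been innovated before."""
    key = (in_node, out_node)
    if key not in _seen:
        _seen[key] = next(_counter)
    return _seen[key]

@dataclass
class ConnectionGene:
    in_node: int
    out_node: int
    weight: float
    enabled: bool
    innovation: int

def matching_genes(parent_a, parent_b):
    """Align two genomes (lists of ConnectionGene) by innovation number;
    genes sharing a marking are matching, the rest are disjoint/excess."""
    a = {g.innovation: g for g in parent_a}
    b = {g.innovation: g for g in parent_b}
    shared = sorted(a.keys() & b.keys())
    return [(a[i], b[i]) for i in shared]
```

The key design point is that alignment by innovation number makes crossover between differently shaped networks well defined without any expensive topology matching.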

NEAT Users Discussion Group

Derek James created a NEAT Users Group on Yahoo to encourage the discussion of ideas, questions, and variations of NEAT. The community of NEAT users and those interested in NEAT can benefit greatly from the availability of this forum. Please feel free to join the discussion!

Paperback Book Available with Chapter on NEAT

For people who are interested in learning about NEAT but prefer explanations intended for general audiences to research-level papers, I am happy to recommend AI Techniques for Game Programming by Mat Buckland. Most of the final chapter of this book describes NEAT in a fun and simple style. The book also comes with source code. It is a good resource for hobbyists or video game programmers interested in AI techniques. (Researchers should still refer to the NEAT research publications available below.) The book also includes useful introductions to genetic algorithms and neural networks. Note that coding questions in the FAQ on this page refer to my own source code release and not the code in this book, though some answers may still be useful.

NEAT Software FAQ

Please note: In general, most answers refer to the original C++ code intended for Linux, which I wrote myself. The Java code, written by Ugo Vierucci, was made to follow the C++ code faithfully, so most answers apply to the Java version as well. However, pole balancing experiments are not included in the Java version, so questions on pole balancing do not apply to JNEAT. Mat Buckland's Windows version and Christian Mayr's Matlab version were written independently, so the code-related answers below probably do not apply to the Windows or Matlab distributions.

General NEAT Methodology FAQ

This FAQ addresses questions that are more abstract or philosophical in nature.

NEAT Users and Projects

There is so much work going on with NEAT that I have stopped trying to keep track of it here. Instead, one way to find information about projects using NEAT (aside from the several NEAT packages available) is to search Google or Google Scholar for "Augmenting Topologies".

The Future of NEAT

Hypercube-based NEAT (HyperNEAT) is the future of NEAT.

HyperNEAT software and source code have also become available.

NEAT Publications

Note: More recent papers from after I moved to UCF are here.

Note: The last three papers are the best introduction to NEAT.

The following papers talk about the NEAT method. I conclude each reference with some commentary describing what the paper is about. The links also lead to abstracts before you download the papers. My homepage has links to all my papers, including papers not specifically about NEAT.

Ph.D. Dissertation: EFFICIENT EVOLUTION OF NEURAL NETWORKS THROUGH COMPLEXIFICATION
Kenneth O. Stanley
Department of Computer Sciences, The University of Texas at Austin
Technical Report AI-TR-04-39, August 2004.
Comment: Extensive descriptions of both the NEAT method and associated experiments. 180 pages.

EVOLVING NEURAL NETWORK AGENTS IN THE NERO VIDEO GAME
Kenneth O. Stanley, Bobby D. Bryant, and Risto Miikkulainen
Department of Computer Sciences, The University of Texas at Austin
Proceedings of the IEEE 2005 Symposium on Computational Intelligence and Games (CIG'05). Piscataway, NJ: IEEE, 2005.
Winner of the Best Paper Award at CIG'05
Comment: This 8-page paper describes how NEAT was adapted to work in real-time in the NERO video game.

AUTOMATIC FEATURE SELECTION IN NEUROEVOLUTION
Shimon Whiteson, Peter Stone, Kenneth O. Stanley, Risto Miikkulainen, and Nate Kohl
Department of Computer Sciences, The University of Texas at Austin
To appear in Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2005).
Comment: Describes how starting NEAT with most inputs disconnected and letting NEAT decide which ones to connect is more effective than starting with all inputs connected, particularly in problems with many potential inputs.

REAL-TIME NEUROEVOLUTION IN THE NERO VIDEO GAME
Kenneth O. Stanley, Bobby D. Bryant, and Risto Miikkulainen
Department of Computer Sciences, The University of Texas at Austin
IEEE Transactions on Evolutionary Computation, volume 9, number 6, pages 653-668, December 2005.
Comment: Journal paper describes how NEAT was enhanced to run in real-time inside a new genre of video game where agents are trained by the player during gameplay.

EVOLVING A ROVING EYE FOR GO
Kenneth O. Stanley and Risto Miikkulainen
Department of Computer Sciences, The University of Texas at Austin
Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2004). New York, NY: Springer-Verlag, 2004
Comment: This conference paper describes how a roving eye of a fixed input size can be evolved to play Go on boards of different sizes, and how an eye trained on a smaller board can go on to learn on a larger board better than an eye trained starting on the larger board.

EVOLVING ADAPTIVE NEURAL NETWORKS WITH AND WITHOUT ADAPTIVE SYNAPSES
Kenneth O. Stanley, Bobby D. Bryant, and Risto Miikkulainen
Department of Computer Sciences, The University of Texas at Austin
Proceedings of the 2003 IEEE Congress on Evolutionary Computation (CEC-2003). Canberra, Australia: IEEE Press, 2003
Comment: Conference paper describing experiments evolving neural networks with synaptic weights that change over time according to local Hebbian rules. Explains how traits are used in NEAT: See Section 2.4 of this paper. Traits are referred to as a rule set in the paper.
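For readers unfamiliar with local Hebbian plasticity, the basic idea is that a weight change depends only on the activity of the two neurons the connection joins. The following is an illustrative sketch of the plain Hebbian form; the evolved rules in the paper are richer, combining several correlation terms with coefficients stored in traits, so this is not the paper's actual rule:

```python
def hebbian_update(weight, pre, post, learning_rate=0.1, w_max=8.0):
    """Plain Hebbian rule: strengthen a connection when the presynaptic
    activation `pre` and postsynaptic activation `post` are active
    together. The clamp keeps the weight in [-w_max, w_max].

    Illustrative only: the paper's evolved rule set uses multiple terms
    with per-trait coefficients rather than this single product term.
    """
    weight += learning_rate * pre * post
    return max(-w_max, min(w_max, weight))
```

Because the update uses only locally available quantities, it can run online inside the network during a task, which is what makes the networks adaptive.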

COMPETITIVE COEVOLUTION THROUGH EVOLUTIONARY COMPLEXIFICATION
Kenneth O. Stanley and Risto Miikkulainen
Department of Computer Sciences, The University of Texas at Austin
Journal of Artificial Intelligence Research 21: 63-100, 2004.
Comment: This 38-page journal article expands on the importance of complexification from a minimal starting point, and shows how it leads to the discovery of more complex structures than would otherwise be possible.

CONTINUAL COEVOLUTION THROUGH COMPLEXIFICATION
Kenneth O. Stanley and Risto Miikkulainen
Department of Computer Sciences, The University of Texas at Austin
To appear in Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2002). San Francisco, CA: Morgan Kaufmann, 2002
Comment: Conference paper about using the idea of complexification in NEAT to allow for continual elaboration on strategies leading to a sustained arms race in competitive coevolution.

EFFICIENT EVOLUTION OF NEURAL NETWORK TOPOLOGIES
Kenneth O. Stanley and Risto Miikkulainen
Department of Computer Sciences, The University of Texas at Austin
Proceedings of the 2002 Congress on Evolutionary Computation (CEC '02). Piscataway, NJ: IEEE, 2002
Comment: A short conference paper that describes NEAT from the perspective of deconstructing the system using ablation studies.

EFFICIENT REINFORCEMENT LEARNING THROUGH EVOLVING NEURAL NETWORK TOPOLOGIES
Kenneth O. Stanley and Risto Miikkulainen
Department of Computer Sciences, The University of Texas at Austin
To appear in Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2002). San Francisco, CA: Morgan Kaufmann, 2002
Comment: A conference paper that describes NEAT from the point of view of performance and visualization.
Winner of the Best Paper Award in Genetic Algorithms

EVOLVING NEURAL NETWORKS THROUGH AUGMENTING TOPOLOGIES
Kenneth O. Stanley and Risto Miikkulainen
Department of Computer Sciences, The University of Texas at Austin
Evolutionary Computation 10(2):99-127, 2002.
Comment: A journal paper including a comprehensive description of the NEAT method, as well as substantial background information and performance analysis. Should be useful for implementing your own version of NEAT.

Note: A complete list of my publications can be found on my homepage.

Contact me here:

kstanley@cs.ucf.edu


Updates: This page was substantially revised on 8/31/03. New answers were added to the ends of both FAQs as of 5/8/03. A new paper on evolving synaptic plasticity (using traits) is available through this page as of 6/22/03. A NEAT User's Group opened on Yahoo on 8/26/03. More projects listed on 12/4/03. On 12/21/03 Mattias Fagerlund released full source code and demos for Delphi NEAT. 2/16/04: A new question explains how to start NEAT with genomes with some inputs initially disconnected from the network. 3/5/04: The question on starting disconnected now links to code for mutate_add_sensor both in Java and C++. 3/24/04: Added link to new paper on evolving a roving eye with NEAT for Go. 4/7/04: Added link to SharpNEAT, a new NEAT software package. 6/9/04: Added link to ANJI (Another NEAT Java Implementation). 9/14/04: Added question and answer about testing with XOR. 9/15/04: Elaborated answer on testing with XOR. 10/27/04: Added 2 more papers to the publications list. 5/4/05: Added 2 more papers (feature selection and NERO award winner) and updated answers to FAQ questions on starting with some inputs disconnected, also pointing them to the new paper. 9/7/05: Linked Ashot Petrosian's NEAT Code Documentation. 1/20/06: Changed link to NERO tech report to the journal paper that replaced it. 9/8/06: Added links to rtNEAT source code and NEAT4J, a new Java-based NEAT release. 10/12/06: Updated answer to question, "Can I start NEAT with some inputs disconnected?" 5/20/07: Added link to EPlex in intro text. 1/10/08: Fixed broken links, updated out-of-date text, added HyperNEAT links. 1/16/08: Updated answer to "Have you tried using non-sigmoid activation functions" to cite CPPNs. 4/24/08: Added link to my original notes wherein I thought of the NEAT algorithm. 5/9/08: Updated mention of HyperNEAT at top of page to link to more papers. 2/8/09: Added link to eplex publications. 4/17/09: Linked to HyperNEAT Users Page.
1/6/10: Removed broken link to NEAT Code Documentation. 3/9/10: Fixed link to DelphiNEAT (now goes to eplex server). 6/15/10: Added links to NEAT4J and Encog NEAT. 7/19/10: Added question, "Can you explain some of the NEAT Parameters?" 7/19/10: Added link to Wesley Tansey's SharpNEAT 2 TTT tutorial. 9/5/11: Small change to HyperNEAT text and links at top of page. 1/12/12: Added link to ObjectiveNEAT package. 9/19/12: Added link to Peter Chervenski's MultiNEAT software package. 3/2/13: Added link to Eric Laukien's NEAT Visualizer package. 9/17/13: Added link to Fernando Torres' reorganized github version of NEAT C++.