In this post, I will try to explain a machine learning method called NeuroEvolution of Augmenting Topologies (NEAT).

Introduction:

I love to learn. It is really exciting to open up a book or a paper about a topic I have not encountered yet and try to understand it. Some people solve crossword puzzles, sudoku or riddles in their free time to stimulate their minds; I love to read papers and try to implement the proposed algorithms. And since I love to learn, but will sadly never be able to learn everything I would actually like to, I am especially interested in teaching computers to learn for me and to present me with the information that maximizes the effectiveness and efficiency of my own learning. Algorithms that help structure complex information in this way belong to a field called "expert systems". The goal is therefore to "create" an "expert" that you can consult whenever you need to make a decision. Let us take a high-level look at a model that helps create those "experts", called NEAT. This article will introduce the concept. In future articles, I might actually share implementation examples.

What is NEAT?

NEAT stands for "NeuroEvolution of Augmenting Topologies" and describes an algorithmic approach to self-learning machines inspired by the genetic mechanisms of natural evolution.

Life itself is absolutely fascinating. Whenever I look at nature, I tend to see a common denominator in all living organisms: information. Information seems to be the core inheritance of reproduction. Nature has been optimizing information passing for the past 3.7 billion years and has created the many different species that coexist today. Charles Darwin's concept of the survival of the fittest sharpened our understanding of why certain species evolve while others go extinct. The goal of evolutionary computer scientists is to build systems that solve complex problems by imitating natural evolution.

The easiest, if very simplified, way to describe how NEAT works is with an example. If you had to design an expert system that plays a game for you as well as possible, what would the relevant factors be?

Well, firstly, it is important to define all the potential actions the "player" is allowed to execute. Super Mario, for example, is able to jump, duck, walk left or right, twist, run fast and so on. If we connect a machine to these controls and allow it to execute them, it will at least potentially be able to do something.
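To make this concrete, here is a minimal sketch of what such an action space might look like. The action names and the threshold-based selection are my own illustrative assumptions, not something prescribed by NEAT itself:

```python
from enum import Enum, auto

class Action(Enum):
    """Hypothetical action space for a Mario-like agent."""
    LEFT = auto()
    RIGHT = auto()
    JUMP = auto()
    DUCK = auto()
    RUN = auto()

def select_actions(outputs, threshold=0.5):
    """One network output per action; execute every action whose
    activation exceeds the threshold (several can fire at once,
    e.g. running while jumping)."""
    return [a for a, o in zip(Action, outputs) if o > threshold]
```

For example, network outputs of `[0.9, 0.1, 0.8, 0.2, 0.4]` would make the agent move left and jump at the same time.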

Secondly, it is important to define a goal for the computer. NEAT works with a concept called a "fitness score": a mathematical function that rewards success. In a game like Mario, the fitness score would be the advancement of the player towards the finish line. Many more variables can be included in the fitness function, such as collected coins, defeated enemies or the overall time to finish.
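Such a fitness function could be sketched as a simple weighted sum. The specific weights below are made-up illustrations; in practice they would be tuned to the game:

```python
def fitness(distance, coins, enemies_defeated, time_taken):
    """Hypothetical fitness for a Mario-like level: progress toward
    the finish line dominates, coins and defeated enemies add smaller
    bonuses, and time spent is penalized."""
    return (10.0 * distance
            + 1.0 * coins
            + 2.0 * enemies_defeated
            - 0.1 * time_taken)
```

A run that covers more distance always scores higher than one that merely farms coins, which is the behavior we want the evolution to reward.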

And thirdly, it is important to define the rules of evolution. NEAT allows the mutation of nodes, new connections between nodes and the inheritance of the fittest neural networks into new offspring. Additionally, NEAT ensures that different kinds of "species" can coexist until they are allowed to compete with each other to produce new, fitter offspring. To ensure that the fittest networks survive, that already-tried structures are not rediscovered as new in the future and that existing networks keep optimizing themselves, NEAT adds an innovation number to each gene that acts as a historical marker.
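The gene-plus-innovation-number idea can be sketched as follows. This is a simplified reading of the paper's genome encoding, with my own naming:

```python
from dataclasses import dataclass

@dataclass
class ConnectionGene:
    """A NEAT connection gene; 'innovation' is the historical marker
    assigned when this structure first appears in any genome."""
    in_node: int
    out_node: int
    weight: float
    enabled: bool
    innovation: int

# A global registry hands out innovation numbers, so the same
# structural innovation gets the same number in every genome and
# genomes can later be aligned gene-by-gene for crossover.
_registry = {}

def innovation_number(in_node, out_node):
    key = (in_node, out_node)
    if key not in _registry:
        _registry[key] = len(_registry) + 1
    return _registry[key]
```

The registry is what prevents the same connection from being counted as a brand-new invention twice.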

The figure above shows a mutation that adds a connection and a mutation that adds a node. In the upper example, the connection from 2 to 4 is disabled and a new connection from 3 to 5 is created instead. In the lower example of node mutation, one can see how the connection from 3 to 4 is disabled, a new node 6 is introduced, and two new connections are created, from 3 to 6 and from 6 to 4.
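The add-node mutation can be sketched like this, representing each gene as a plain dict for brevity (the weight choices follow the paper's convention of making the spliced-in node initially behave like the old connection):

```python
import random

def mutate_add_node(genome, new_node_id, new_innov):
    """Sketch of NEAT's add-node mutation. Each gene is a dict:
    {'in', 'out', 'weight', 'enabled', 'innov'}. An enabled
    connection is disabled and a new node spliced in its place,
    e.g. 3 -> 4 becomes 3 -> 6 -> 4."""
    conn = random.choice([g for g in genome if g['enabled']])
    conn['enabled'] = False
    # incoming connection gets weight 1.0, outgoing keeps the old
    # weight, so the network's behavior is initially unchanged
    genome.append({'in': conn['in'], 'out': new_node_id, 'weight': 1.0,
                   'enabled': True, 'innov': new_innov})
    genome.append({'in': new_node_id, 'out': conn['out'],
                   'weight': conn['weight'], 'enabled': True,
                   'innov': new_innov + 1})
    return genome
```

Note that the disabled gene stays in the genome; it is only switched off, which matters later during crossover.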

The figure above shows how the crossover takes place. Parent 1 and Parent 2 have similar structures of nodes and connections, but they also have distinctions. The genes are aligned by their innovation numbers, and the offspring inherits from both parents: matching genes come randomly from either parent, while genes that appear in only one parent are taken from the fitter one, and a gene disabled in either parent has a chance of staying disabled. This way the offspring preserves the inherited information that has proven effective with respect to the fitness score, while new evolutionary nodes and/or connections are still considered. "Matching genes are inherited randomly, whereas disjoint genes (those that do not match in the middle) and excess genes (those that do not match in the end) are inherited from the more fit parent. In this case, equal fitnesses are assumed, so the disjoint and excess genes are also inherited randomly. The disabled genes may become enabled again in future generations: there’s a preset chance that an inherited gene is disabled if it is disabled in either parent." [Stanley, Miikkulainen, Page 109, NEAT]
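A sketch of this crossover for the equal-fitness case quoted above might look as follows. Representing a genome as a dict from innovation number to `(weight, enabled)` is my own simplification:

```python
import random

def crossover(parent1, parent2, p_keep_disabled=0.75):
    """NEAT crossover for equally fit parents: matching genes are
    inherited randomly, disjoint/excess genes are kept from whichever
    parent has them. Genomes map innovation number -> (weight, enabled)."""
    child = {}
    for innov in set(parent1) | set(parent2):
        if innov in parent1 and innov in parent2:
            weight, enabled = random.choice([parent1[innov], parent2[innov]])
            # a gene disabled in either parent has a preset chance
            # of staying disabled in the child
            if not (parent1[innov][1] and parent2[innov][1]):
                enabled = random.random() > p_keep_disabled
            child[innov] = (weight, enabled)
        else:
            child[innov] = parent1.get(innov) or parent2.get(innov)
    return child
```

Because genes are aligned by innovation number rather than by position, two networks with very different topologies can still be recombined meaningfully.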

Now that all of this is more or less clear, NEAT can work its way through the game by iterating and evolving new neural networks until it reaches an optimal fitness score. I highly recommend reading the paper; it is very well written and, with a little googling, understandable for beginners. But if you would like a quick crash course, you can also watch this wonderful video on YouTube:
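The outer evolutionary loop can be sketched generically. This is a heavily simplified version that omits NEAT's speciation mechanism and simply keeps the fitter half of each generation:

```python
import random

def evolve(population, evaluate, mutate, crossover, generations=100):
    """Simplified outer loop: score every genome, keep the fitter
    half, and refill the population with mutated offspring of
    randomly chosen surviving parents."""
    for _ in range(generations):
        scored = sorted(population, key=evaluate, reverse=True)
        survivors = scored[:len(scored) // 2]
        offspring = []
        while len(survivors) + len(offspring) < len(population):
            p1, p2 = random.sample(survivors, 2)
            offspring.append(mutate(crossover(p1, p2)))
        population = survivors + offspring
    return max(population, key=evaluate)
```

Because the best genome always survives into the next generation, the top fitness in the population can never decrease; real NEAT adds speciation on top of this so that novel topologies get time to optimize before competing with the whole population.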

NEAT is great for solving complex problems by dividing them into smaller problems that can be optimized separately. NEAT agents could be developed for many different individual tasks and afterwards joined to solve more complex problems. One example could be using NEAT for drug design: first lay out all the atoms that can be used, then define a fitness function that gives the simulation its reward system, and finally define the rules of evolution. Another NEAT system could be used to choose manufacturing technologies for producing the newly designed drug. Once these two systems are brought together, a new fitness function optimized for cost efficiency could determine the optimal production method for the newly designed drug. This sounds great in theory, but it is obviously a very complex and complicated task that would take a lot of research and development effort.

Here is the paper: