Most neural machine learning techniques use some form of gradient descent, a calculus-based technique, to minimize error and find a good set of weights for the network being trained. There has been increasing interest in neuromorphic systems, which more closely resemble biological neural systems. In most situations, you cannot use gradient descent with neuromorphic systems.

An alternative to gradient descent is evolutionary optimization. Evolutionary optimization doesn't use gradients, but evolutionary techniques require much, much more processing power, which is why they're rarely used.

Evolutionary optimization maintains a population of possible solutions (sets of good weights). In an iterative process, two "good" possible solutions are selected and then combined to create a new, presumably better, possible solution. Therefore, one of the many sub-problems when using evolutionary optimization is selecting "good" possible solutions.

You don't want to always pick the two best solutions, because a non-best possible solution could still have good characteristics worth passing on.

There are several techniques to choose good, but not necessarily best, items. The two most common techniques are roulette wheel selection and tournament selection.
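This post uses tournament selection, which is explained below. For comparison, here is a minimal sketch of what roulette wheel selection might look like. The RouletteSelect() function name and the 1/error fitness conversion are my illustrative choices, and the sketch assumes all error values are strictly positive:

```csharp
static int RouletteSelect(double[] errors, Random rnd)
{
  // each item is selected with probability proportional
  // to its fitness, where fitness = 1 / error
  int n = errors.Length;
  double[] fitness = new double[n];
  double sum = 0.0;
  for (int i = 0; i < n; ++i) {
    fitness[i] = 1.0 / errors[i];  // assumes errors[i] > 0
    sum += fitness[i];
  }
  double p = rnd.NextDouble() * sum;  // spin the wheel
  double cum = 0.0;
  for (int i = 0; i < n; ++i) {
    cum += fitness[i];
    if (p <= cum) return i;
  }
  return n - 1;  // guard against floating point round-off
}
```

Roulette wheel selection favors low-error items but can still select any item, which is exactly the "good but not necessarily best" behavior you want.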

Suppose you have 10 possible solutions, and their associated errors are [0.1, 0.2, ... 1.0], so the best solution is at index [0], the second best is at [1], and so on. To use tournament selection, you select a random subset of the solutions and then pick the best one from that subset. Suppose you set the size of the subset to 0.4 (40%) of the population. This percentage is often called the tau value. Then suppose the 40% random subset is items [5, 3, 6, 4], so the associated errors are [0.6, 0.4, 0.7, 0.5]. From this subset, you'd pick item [3] because it has the smallest error.

The tau value controls selection pressure. If tau = 1.0, you always examine all the items, and so you'll always get the best item. If tau is small, say 0.2 (20%), then there's a much greater chance of getting a non-best item.

Here is some C# code:

static int Select(double[] errors, double tau, Random rnd)
{
  // pick best from a random tau-percent of population
  int popSize = errors.Length;
  int numItems = (int)(popSize * tau);
  int[] allIndices = new int[popSize];
  for (int i = 0; i < popSize; ++i)
    allIndices[i] = i;
  Shuffle(allIndices, rnd);

  int bestIdx = allIndices[0];
  double bestErr = errors[allIndices[0]];
  for (int i = 0; i < numItems; ++i) {
    int idx = allIndices[i];
    if (errors[idx] < bestErr) {
      bestIdx = idx;
      bestErr = errors[idx];
    }
  }
  return bestIdx;
}

The idea is to use a Shuffle() function to scramble the order of the indices, then take the first numItems of them as the random subset. Shuffle() uses the Fisher-Yates mini-algorithm:

static void Shuffle(int[] vec, Random rnd)
{
  int n = vec.Length;
  for (int i = 0; i < n; ++i) {
    int ri = rnd.Next(i, n);  // random index in [i, n)
    int tmp = vec[ri];
    vec[ri] = vec[i];
    vec[i] = tmp;
  }
}

For evolutionary optimization, you want two good items that are not the same:

static int[] SelectTwo(double[] errors, double tau, Random rnd)
{
  int[] result = new int[2];
  int ct = 0;
  result[0] = Select(errors, tau, rnd);
  while ((result[1] = Select(errors, tau, rnd)) == result[0] && ct < 100)
    ++ct;
  return result;
}

Here I just use brute force to repeatedly pick a second item until it’s not the same as the first. I set a sanity stop of 100 tries.

Notice the SelectTwo() function calls Select() which calls Shuffle(). When writing complex software, it’s usually a good idea to mask complexity by refactoring into helper functions.
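Putting the pieces together, calling code might look like this sketch. The error values are the ones from the example above; the actual indices selected depend on the Random seed, so I haven't shown specific output:

```csharp
static void Main(string[] args)
{
  double[] errors = new double[] { 0.1, 0.2, 0.3, 0.4, 0.5,
    0.6, 0.7, 0.8, 0.9, 1.0 };  // 10 possible solutions
  Random rnd = new Random(1);
  double tau = 0.40;  // examine 40% of the population

  int[] parents = SelectTwo(errors, tau, rnd);
  Console.WriteLine("Selected parent indices: " +
    parents[0] + " " + parents[1]);
}
```

In a full evolutionary optimization program, the two selected parents would then be combined, for example by crossover, to create a new possible solution.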

*The Venice Carnival has featured beautiful masks and costumes since the 12th century.*