Neural Network Training using Simplex Optimization

I wrote an article titled “Neural Network Training using Simplex Optimization” in the October 2014 issue of Visual Studio Magazine. A neural network is essentially a complicated math equation that has variables and coefficients. Training a neural network is the process of finding good values for the coefficients (which are called weights and biases).
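To make the "math equation" idea concrete, here is a minimal sketch (not the article's code) of a tiny 2-input, 3-hidden, 1-output network in Python. The function names and the tanh activation are my own illustrative choices.

```python
import math

def compute_output(x, weights_ih, biases_h, weights_ho, bias_o):
    # Hypothetical 2-3-1 network: the coefficients are the
    # input-to-hidden weights, hidden biases, hidden-to-output
    # weights, and the output bias.
    hidden = []
    for j in range(len(biases_h)):
        # weighted sum of inputs plus bias, then tanh activation
        s = sum(x[i] * weights_ih[i][j] for i in range(len(x))) + biases_h[j]
        hidden.append(math.tanh(s))
    # output is a weighted sum of hidden values plus the output bias
    return sum(hidden[j] * weights_ho[j] for j in range(len(hidden))) + bias_o
```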


To find good values for the weights and biases, you use training data that has known input and output values. You want to minimize the error between the computed outputs and the actual outputs. This is a numerical optimization problem.
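The error to be minimized can be measured in different ways; a common choice (assumed here, the article may use something different) is mean squared error over the training data:

```python
def mean_squared_error(train_data, predict):
    # train_data: list of (inputs, target) pairs with known output values
    # predict: a function that maps inputs to a computed output
    total = 0.0
    for inputs, target in train_data:
        computed = predict(inputs)
        total += (computed - target) ** 2
    return total / len(train_data)
```

Training then means searching for the weights and biases that make this error as small as possible.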

There are roughly a dozen common numerical optimization techniques that can be used to train a neural network. By far the most common is back-propagation. Another technique that is becoming increasingly popular is particle swarm optimization.

One of the oldest numerical optimization techniques is simplex optimization. A simplex is a triangle, so simplex optimization works with three candidate solutions at a time. There are many variations of simplex optimization; the most common is the Nelder-Mead algorithm. My article uses a simpler version of simplex optimization that doesn't have a particular name.
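The following is a rough sketch of the general three-candidate idea, not the article's exact algorithm: keep a best, an "other," and a worst candidate, and repeatedly try to replace the worst one by reflecting it through the centroid of the other two, contracting toward the centroid if the reflection doesn't help. The parameter names and ranges are illustrative.

```python
import random

def simplex_minimize(error, dim, max_iter=1000, lo=-10.0, hi=10.0):
    # Three random candidate solutions, each a vector of dim values
    simplex = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(3)]
    for _ in range(max_iter):
        simplex.sort(key=error)                 # best first, worst last
        best, other, worst = simplex
        # centroid of the two better candidates
        centroid = [(b + o) / 2.0 for b, o in zip(best, other)]
        # reflect the worst candidate through the centroid
        reflected = [c + (c - w) for c, w in zip(centroid, worst)]
        if error(reflected) < error(worst):
            simplex[2] = reflected              # accept the reflected point
        else:
            # otherwise contract the worst candidate toward the centroid
            simplex[2] = [(w + c) / 2.0 for w, c in zip(worst, centroid)]
    return min(simplex, key=error)
```

For neural network training, the `error` function passed in would evaluate something like the mean squared error above for a given vector of weights and biases.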

Simplex optimization is also known as amoeba method optimization, not because it mimics the behavior of an amoeba, but because if you graph the behavior of the algorithm, which is based on geometry, it looks like a triangle oozing across the screen, vaguely resembling an amoeba.
