Safe Calculation of ln(exp(a) + exp(b))

I’ve been working with an interesting system called a prediction market. The goal of a prediction market is to predict an outcome, such as the winner of a basketball tournament with 64 teams. You get a group of experts and let them buy and sell shares of teams, in a way similar to how the stock market works. The number of shares held can be mapped to a probability of winning.

One of the key underlying math equations of a prediction market is a Cost function. Suppose you have three options (X, Y, Z). If (x, y, z) are the number of shares held by the experts, the Cost function is:

C(x, y, z) = b * ln(exp(x/b) + exp(y/b) + exp(z/b))

Here b is a scaling constant and doesn’t affect the main idea of this post. A straightforward calculation of the Cost function could blow up if x, y, or z is very large: if some value v is large, exp(v) can be astronomically large and cause arithmetic overflow.
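
To see the overflow concretely, here is a tiny check of my own (not from the original post). A C# double overflows to Infinity once the argument to Math.Exp() passes roughly 709.78, which is a property of IEEE 754 doubles:

using System;

class OverflowCheck
{
  static void Main()
  {
    Console.WriteLine(Math.Exp(709));  // approx 8.2E+307 -- still finite
    Console.WriteLine(Math.Exp(710));  // Infinity -- double overflow

    // a direct Cost-style calculation is then ruined:
    double c = Math.Log(Math.Exp(710) + Math.Exp(5));
    Console.WriteLine(c);  // Infinity, though the true answer is about 710.0
  }
}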

If you toss out the irrelevant b constant (irrelevant in terms of the overflow problem), the simplified Cost function with three input values is:

C = ln(exp(x) + exp(y) + exp(z))

In order to have a prediction market that scales to thousands of users, I needed a way to safely calculate ln(exp(a) + exp(b) + exp(c)), where a, b, and c are the input values (this b is unrelated to the scaling constant above). It took me a couple of hours of fiddling with the equation, using properties of the exp() and ln() functions, to come up with a solution. It’s best illustrated by an example.

Suppose a = 5, b = 8, and c = 2. The Cost function (without the liquidity constant) is:

C = ln(exp(5) + exp(8) + exp(2))
  = ln(148.41 + 2980.96 + 7.39)
  = ln(3136.76)
  = 8.05

The trick, which I’ll call the softsum max trick because it resembles a similar idea used when calculating something called the softmax function, is:

C = ln(exp(5-8) + exp(8-8) + exp(2-8)) + 8
  = ln(exp(-3) + exp(0) + exp(-6)) + 8
  = ln(0.0498 + 1.0000 + 0.0025) + 8
  = ln(1.0523) + 8
  = 0.05 + 8
  = 8.05

In words, instead of using each raw input value, you use each value minus the max input value, which keeps the argument of each exp() small. The trick works because exp(v) = exp(m) * exp(v - m) for any m, so ln(exp(a) + exp(b) + exp(c)) = ln(exp(m) * (exp(a-m) + exp(b-m) + exp(c-m))) = ln(exp(a-m) + exp(b-m) + exp(c-m)) + m. Picking m as the max input makes every exponent zero or negative, so every exp() term is at most 1.0 and can’t overflow.

Here’s a C# demo:

using System;
namespace SoftsumTrick
{
  class Program
  {
    static void Main(string[] args)
    {
      Console.WriteLine("\nBegin softsum demo \n");
      Console.WriteLine("Goal is to safely compute ");
      Console.WriteLine(" ln(exp(a) + exp(b) + . . )");

      int[] vals = new int[] { 5, 8, 2 };
      Console.Write("\nInput values = ");
      for (int i = 0; i < vals.Length; ++i)
        Console.Write(vals[i] + " ");
      Console.WriteLine("");

      Console.WriteLine("\nln(sum(exp(x))) directly = ");
      double v1 = Crude(vals);
      Console.WriteLine(v1.ToString("F4"));

      Console.WriteLine("\nln(sum(exp(x))) safely = ");
      double v2 = Softsum(vals);
      Console.WriteLine(v2.ToString("F4"));

      Console.WriteLine("\nEnd softsum demo \n");
    } // Main

    static double Crude(int[] vals)
    {
      // direct calculation (could overflow for large inputs)
      double sum = 0.0;
      for (int i = 0; i < vals.Length; ++i)
        sum += Math.Exp(vals[i]);
      return Math.Log(sum);
    }

    static double Softsum(int[] vals)
    {
      // safe calculation: subtract the max before exponentiating
      int max = vals[0];
      for (int i = 1; i < vals.Length; ++i)
        if (vals[i] > max)
          max = vals[i];
      double sum = 0.0;
      for (int i = 0; i < vals.Length; ++i)
        sum += Math.Exp(vals[i] - max);  // every exponent is <= 0
      return Math.Log(sum) + max;
    }

  } // Program
} // ns
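
For the inputs (5, 8, 2), both Crude() and Softsum() print the same value, 8.0509. The difference shows up with large inputs: Crude() returns Infinity once any input exceeds roughly 709, because Math.Exp() overflows, while Softsum() still returns the correct result.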


The softsum trick can then be applied to the actual Cost function, which I’ll describe in a future post.
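
For readers who don’t want to wait, here is a minimal sketch of how that might look, assuming the Cost function has the b * ln(sum of exp(q/b)) form shown earlier. The SafeCost name and structure are my own illustration, not necessarily what the future post will show: divide each share count by b, apply the softsum max trick, then multiply the result back by b.

using System;

class CostDemo
{
  // sketch: C = b * ln(exp(x/b) + exp(y/b) + exp(z/b)),
  // computed safely with the softsum max trick
  static double SafeCost(double[] shares, double b)
  {
    double max = shares[0] / b;
    for (int i = 1; i < shares.Length; ++i)
      if (shares[i] / b > max)
        max = shares[i] / b;
    double sum = 0.0;
    for (int i = 0; i < shares.Length; ++i)
      sum += Math.Exp(shares[i] / b - max);  // every exponent <= 0
    return b * (Math.Log(sum) + max);
  }

  static void Main()
  {
    double[] shares = new double[] { 5.0, 8.0, 2.0 };
    Console.WriteLine(SafeCost(shares, 1.0).ToString("F4"));    // 8.0509, matches the demo
    Console.WriteLine(SafeCost(shares, 100.0).ToString("F4"));  // more liquidity flattens the cost
  }
}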
