An Example of the Python SciPy line_search() Function

The term “line search” refers to a classical numerical optimization technique for minimizing a function. The Python SciPy code library has a line_search() function, but it’s only a helper for a line search — line_search() doesn’t actually perform a line search minimization. (A classic example of a bad function name.)

I did a quick Internet search, looking for a code example of using the line_search() function to actually perform a line search minimization, and found . . . nothing. This raised an immediate red flag. I found a couple of examples of how to call the line_search() function, all of them copied from the SciPy documentation, but there were no examples of using line_search() to do a line search. So, I set out to write one.

First I set up a function to minimize: f(x,y) = 4x^2 + 3y^2 + 1. The function has an obvious minimum at (x,y) = (0,0), where the value of the function is 1.

The function is convex, which loosely means it doesn’t have severe peaks and valleys, which in turn means it’s relatively easy to find the minimum.

The line_search() function accepts a function to minimize, the gradient function for the function to minimize, a starting point xk, and a search direction pk. The function returns a tuple of six values, but the most important one is result[0], which is called the alpha value. Alpha is a step size: if you scale the search direction by alpha and add the result to the start point, the new point will be closer to the minimum (along that direction).
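As a minimal illustration of the call signature (my own toy example, not from the SciPy docs), here is line_search() applied to a one-variable quadratic:

```python
import numpy as np
from scipy.optimize import line_search

def f(x):
  # one-variable quadratic, minimum at x = 0
  return float(x[0]**2)

def g(x):
  # gradient of f
  return np.array([2.0 * x[0]])

xk = np.array([10.0])   # start point
pk = np.array([-1.0])   # search direction (downhill)

results = line_search(f, g, xk, pk)
alpha = results[0]      # step size; None if the search failed
print("alpha =", alpha)
print("f at new point =", f(xk + alpha * pk))
```

The other five return values (numbers of function and gradient evaluations, function values, and the new slope) are rarely needed in practice.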

For example, let f(x,y) = 4x^2 + 3y^2 + 1, and suppose the start point is xk = (10, 10) and the direction is pk = (-0.10, -0.10). Suppose line_search() returns alpha = 4.0. This means a new point that’s closer to the minimum is:

xk = xk + (alpha * pk)
   = (10, 10) + 4 * (-0.10, -0.10)
   = (10, 10) + (-0.40, -0.40)
   = (9.60, 9.60)

which is in fact closer to the minimum at (0, 0).
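The same update can be checked with NumPy (a sketch of the arithmetic above, with alpha hard-coded to the hypothetical 4.0 rather than coming from line_search()):

```python
import numpy as np

xk = np.array([10.0, 10.0])      # start point
pk = np.array([-0.10, -0.10])    # search direction
alpha = 4.0                      # hypothetical step size

xk_new = xk + (alpha * pk)       # elementwise: (10,10) + (-0.40,-0.40)
print(xk_new)                    # [9.6 9.6]
```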

It turns out that line_search() returns None for alpha if it can’t make any progress. So, to find the minimum of a function func with gradient grad, the steps in pseudo-code are:

set start point xk
set direction pk
loop:
  alpha = line_search(func, grad, xk, pk)[0]
  if alpha is None: break
  xk = xk + (alpha * pk)

Note that the None check must come before the update — adding None * pk to xk throws an error.
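A runnable sketch of that loop, using the same f(x,y) = 4x^2 + 3y^2 + 1 (I suppress the LineSearchWarning that SciPy emits when the search can’t make progress, and cap the iterations as a safety net):

```python
import warnings
import numpy as np
from scipy.optimize import line_search

warnings.filterwarnings("ignore")  # silence LineSearchWarning

def func(x):
  return 4*(x[0]**2) + 3*(x[1]**2) + 1

def grad(x):
  return np.array([8*x[0], 6*x[1]])

xk = np.array([10.0, 10.0])    # start point
pk = np.array([-0.10, -0.10])  # fixed search direction

for i in range(200):
  alpha = line_search(func, grad, xk, pk)[0]
  if alpha is None:
    break                      # no more progress along pk
  xk = xk + (alpha * pk)

print("final point:", xk, " f =", func(xk))
```

Because the fixed direction (-0.10, -0.10) happens to point straight at the minimum from (10, 10), this particular run gets close to (0, 0); with a less lucky direction the loop stalls at the best point along that one line.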

I did some experiments and ran into lots of problems. Briefly, line search worked well for a simple convex function of a single independent variable, but it didn’t work well for functions of two or more independent variables.

Just for fun, I coded up a demo to find the minimum of f(x,y) = 4x^2 + 3y^2 + 1 using a machine learning approach with a learning rate (instead of using the line_search() approach). In pseudo-code:

set start point xk
set learn rate lr
loop until satisfied
  g = grad(xk)
  xk = xk - (lr * g)

The idea here is that the gradient gives information about what direction and how far to move xk so that the value of the function will get smaller.
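For example, one step of the update rule above from (10, 10) with learning rate 0.10 works out like this:

```python
import numpy as np

def grad(x):
  # gradient of f(x,y) = 4x^2 + 3y^2 + 1
  return np.array([8*x[0], 6*x[1]])

xk = np.array([10.0, 10.0])
lr = 0.10
xk = xk - (lr * grad(xk))   # (10,10) - 0.10*(80,60) = (2,4)
print(xk)                   # [2. 4.]
```

The x-coordinate moves farther than the y-coordinate because its partial derivative is larger — the gradient encodes both direction and relative distance.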

Three examples of portraits from artists whose style is based on lines, from moderately line-focused to extremely line-focused. Left: By artist Ikenaga Yasunari. Center: By artist Mads Berg. Right: By artist Ichiro Tsuruta.


import numpy as np
from scipy.optimize import line_search
import warnings

def func(x):
  # x is a np.float32 vector
  return 4*(x[0]**2) + 3*(x[1]**2) + 1  # scalar

def grad(x):
  return np.array([8*x[0], 6*x[1]], dtype=np.float32)

def main():
  print("\nBegin minimize f(x,y) = 4x^2 + 3y^2 + 1 demo")

  print("\nGradient descent ala machine learning, LR = 0.10: ")
  xk = np.array([10.0, 10.0], dtype=np.float32)
  lr = 0.10
  for i in range(15):
    g = grad(xk)
    xk = xk - (lr * g)
    fv = func(xk)
    print("i = %4d  x,y = %0.6f %0.6f fv = %0.4f" % (i, xk[0], \
xk[1], fv))

  print("\nUsing scipy line_search(): ")
  xk = np.array([10.0, 10.0], dtype=np.float32)  # start
  pk = np.array([-0.10, -0.10], dtype=np.float32)  # direction

  for i in range(200):
    results = line_search(func, grad, xk, pk)  # warns when stuck
    # print(results)
    alpha = results[0]
    if alpha is None:
      print("line_search done ")
      break  # must exit before using alpha in the update

    xk = xk + alpha * pk
    fv = func(xk)
    print("i = %4d  x,y = %0.6f %0.6f fv = %0.4f" % (i, xk[0], \
xk[1], fv))

  print("\nEnd demo ")

if __name__ == "__main__":
  main()

This entry was posted in Machine Learning.
