Example of Kernel Ridge Regression Using the scikit Library

A regression problem is one where the goal is to predict a single numeric value. For example, you might want to predict the income of a person based on their sex, age, State, and political leaning. (Note: Somewhat confusingly, “logistic regression” is a binary classification technique in spite of its name).

The scikit (short for scikit-learn or sklearn) library has a Kernel Ridge Regression (KRR) module to predict a numeric value. KRR is an advanced version of basic linear regression. The "Kernel" in KRR means the technique uses the kernel trick, which allows KRR to deal with complex data where the relationship between the predictors and the target isn't linear. The "Ridge" indicates KRR uses ridge regularization to limit model overfitting. I hadn't looked at KRR in a long time so I decided to code up a quick demo.

I used one of my standard demo datasets that looks like:

# sex age   state   income   politics
#  0  0.27  0 1 0   0.7610   0 0 1
#  1  0.19  0 0 1   0.6550   1 0 0
. . .

The goal is to predict income from sex, age, State and politics. The sex column is encoded as Male = 0, Female = 1. Ages are divided by 100. The States are Michigan = 100, Nebraska = 010, Oklahoma = 001. Incomes are divided by $100,000. The politics are conservative = 100, moderate = 010, liberal = 001.

Note: Kernel ridge regression was originally designed for problems with strictly numeric predictor values. My demo data has categorical predictors (sex, State, political leaning) that have been encoded numerically: sex as a single 0-or-1 value, and State and political leaning as one-hot vectors. I'm not sure this is entirely principled from a mathematical point of view, but in practice encoding non-numeric data for KRR seems to work.
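
As a concrete example, here is a small sketch showing how one raw record maps to the encoded form used in the data files. The helper name encode_person is hypothetical and is not part of the demo program.

# sketch of the encoding scheme; encode_person is a hypothetical helper
def encode_person(sex, age, state, income, politics):
  sex_enc = 0 if sex == "male" else 1
  age_enc = age / 100.0
  state_enc = {"michigan": [1,0,0], "nebraska": [0,1,0],
    "oklahoma": [0,0,1]}[state]
  income_enc = income / 100_000.0
  politics_enc = {"conservative": [1,0,0], "moderate": [0,1,0],
    "liberal": [0,0,1]}[politics]
  return [sex_enc, age_enc] + state_enc + [income_enc] + politics_enc

# encode_person("male", 27, "nebraska", 76100, "liberal") returns
# [0, 0.27, 0, 1, 0, 0.761, 0, 0, 1], which is the first data line above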

I made a training file with 200 items and a test file with 40 items. The complete data is listed below.

Kernel ridge regression is difficult to explain. The technique is based on simple linear regression, where each predictor value is multiplied by a weight. But the technique uses a kernel method where a kernel function is applied to each training item and the item to predict. This allows the technique to deal with data where the relationship between the predictors and the target isn't linear.

The ridge part of the KRR name means that L2 regularization is applied to prevent model overfitting, which kernel techniques are often highly vulnerable to.
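
In equation form, the trained KRR model predicts f(x) = w1 * K(x, x1) + w2 * K(x, x2) + ... + wn * K(x, xn), where the xi are the training items, K is the kernel function, and the weights wi are found by solving (K + alpha * I) w = y, with alpha being the ridge regularization strength. Here is a minimal from-scratch NumPy sketch using an RBF kernel that mirrors this idea. The function names are mine, and the sketch illustrates the math rather than the actual scikit implementation.

# from-scratch sketch of KRR with an RBF kernel; illustrates the math,
# not the actual scikit implementation
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
  # K[i,j] = exp(-gamma * ||A[i] - B[j]||^2)
  sq_dists = np.sum(A**2, axis=1)[:,None] + \
    np.sum(B**2, axis=1)[None,:] - 2 * (A @ B.T)
  return np.exp(-gamma * sq_dists)

def fit_krr(train_X, train_y, alpha=0.1, gamma=1.0):
  K = rbf_kernel(train_X, train_X, gamma)  # n x n kernel matrix
  # the ridge term alpha * I is the L2 regularization
  return np.linalg.solve(K + alpha * np.eye(len(train_X)), train_y)

def predict_krr(x, train_X, wts, gamma=1.0):
  # weighted sum of kernel values between x and every training item
  k = rbf_kernel(x.reshape(1,-1), train_X, gamma)  # shape 1 x n
  return (k @ wts)[0]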

After loading the training data into memory, the key statements in my demo program are:

print("Creating and training KRR RBF(1.0) model ")
# model = KernelRidge(alpha=1.0, kernel='poly', degree=4)
model = KernelRidge(alpha=0.1, kernel='rbf', gamma=1.0)
model.fit(train_X, train_y)

The parameters to the KernelRidge class would take forever to explain in detail, and this is one of the difficulties of using KRR. The kernel function can be one of 'additive_chi2', 'chi2', 'linear', 'poly', 'polynomial', 'rbf', 'laplacian', 'sigmoid', 'cosine', and a good one must be determined by trial and error. The two most common are 'polynomial' and 'rbf' (radial basis function), but weirdly the default is 'linear'.
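
One common way to find reasonable values is a grid search with cross-validation. Here is a minimal sketch, assuming train_X and train_y have been loaded as in the demo program; the candidate values in the grid are only illustrative.

from sklearn.model_selection import GridSearchCV
from sklearn.kernel_ridge import KernelRidge

# candidate kernels and hyperparameter values (illustrative only)
param_grid = [
  {'kernel': ['rbf'], 'alpha': [0.01, 0.1, 1.0],
    'gamma': [0.1, 1.0, 5.0]},
  {'kernel': ['poly'], 'alpha': [0.01, 0.1, 1.0],
    'degree': [2, 3, 4]},
]
gs = GridSearchCV(KernelRidge(), param_grid, cv=5)  # 5-fold cross-validation
gs.fit(train_X, train_y)
print(gs.best_params_)  # best combination found by the search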

One issue with regression problems is that you must implement a program-defined accuracy function. For a classification problem, a prediction is either correct or wrong. But with regression, when you predict a numeric value, you must specify what counts as a correct prediction. I defined an accuracy function where a prediction that is within 10% of the true value is considered correct. For example, if a true (normalized) income is 0.5000, then any prediction between 0.4500 and 0.5500 counts as correct.

I haven't seen kernel ridge regression used very much. Neural networks are more powerful than KRR, but they require lots of training data and are more difficult to fine-tune.



Kernel ridge regression has been around for a long time — since about 1970. St. Patrick’s Day has been celebrated on March 17 since 1631. Here are three examples of St. Patrick’s Day garb that have varying degrees of sophistication.


Demo code.

# kernel_ridge_regression.py
# Anaconda3-2022.10  Python 3.9.13
# Windows 10/11 
# scikit / sklearn 1.0.2

# predict income from sex, age, State, politics

import numpy as np
from sklearn.kernel_ridge import KernelRidge
import pickle

# sex age   state   income   politics
#  0  0.27  0 1 0   0.7610   0 0 1
#  1  0.19  0 0 1   0.6550   1 0 0

# -----------------------------------------------------------

def accuracy(model, data_X, data_y, pct_close):
  # correct within pct of true income
  n_correct = 0; n_wrong = 0

  for i in range(len(data_X)):
    X = data_X[i].reshape(1, -1)  # one-item batch
    y = data_y[i]
    pred = model.predict(X)       # predicted income

    if np.abs(pred - y) < np.abs(pct_close * y):
      n_correct += 1
    else:
      n_wrong += 1
  acc = (n_correct * 1.0) / (n_correct + n_wrong)
  return acc

# -----------------------------------------------------------

def main():
  print("\nBegin kernel ridge regression using scikit demo ")
  print("Predict income from sex, age, State, political ")

  # 0. prepare
  np.random.seed(1)

  # 1. load data
  print("\nLoading data into memory ")
  train_file = ".\\Data\\people_train.txt"
  train_xy = np.loadtxt(train_file, delimiter="\t", 
    usecols=[0,1,2,3,4,5,6,7,8], comments="#", 
    dtype=np.float64)
  train_X = train_xy[:,[0,1,2,3,4,6,7,8]]
  train_y = train_xy[:,5].flatten()  # 1D required

  print("\nFirst four X predictors = ")
  print(train_X[0:4,:])
  print(" . . . ")
  print("\nFirst four target y = ")
  print(train_y[0:4])
  print(" . . . ")

  test_file = ".\\Data\\people_test.txt"
  test_xy = np.loadtxt(test_file, delimiter="\t", 
    usecols=[0,1,2,3,4,5,6,7,8], comments="#", 
    dtype=np.float64)
  test_X = test_xy[:,[0,1,2,3,4,6,7,8]]
  test_y = test_xy[:,5].flatten()  # 1D required

# -----------------------------------------------------------

  # 2. create and train KRR model
  print("\nCreating and training KRR RBF(1.0) model ")
  # KernelRidge(alpha=1.0, *, kernel='linear', gamma=None,
  #   degree=3, coef0=1, kernel_params=None
  # ['additive_chi2', 'chi2', 'linear', 'poly', 'polynomial',
  #  'rbf', 'laplacian', 'sigmoid', 'cosine']

  # model = KernelRidge(alpha=1.0, kernel='poly', degree=4)
  model = KernelRidge(alpha=0.1, kernel='rbf', gamma=1.0)
  model.fit(train_X, train_y)

  # 3. compute model accuracy
  print("\nComputing accuracy (within 0.10 of true) ")
  acc_train = accuracy(model, train_X, train_y, 0.10)
  print("Accuracy on train data = %0.4f " % acc_train)
  acc_test = accuracy(model, test_X, test_y, 0.10)
  print("Accuracy on test data = %0.4f " % acc_test)

  # 4. make a prediction
  print("\nPredicting income for M 34 Oklahoma moderate: ")
  X = np.array([[0, 0.34, 0,0,1,  0,1,0]],
    dtype=np.float32)
  pred_inc = model.predict(X)
  print("$%0.2f" % (pred_inc * 100_000))  # un-normalized

  # 5. save model
  print("\nSaving model ")
  fn = ".\\Models\\krr_model.pkl"
  with open(fn,'wb') as f:
    pickle.dump(model, f)

  # load model
  # with open(fn, 'rb') as f:
  #   loaded_model = pickle.load(f)
  # pi = loaded_model.predict(X)
  # print("%0.2f" % (pi * 100_000))  # un-normalized

  print("\nEnd scikit KRR demo ")

if __name__ == "__main__":
  main()

Training data. Replace commas with tabs or modify program code.
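
If the data files are left comma-delimited, only the delimiter argument of the loadtxt calls needs to change. A sketch:

train_xy = np.loadtxt(train_file, delimiter=",",
  usecols=[0,1,2,3,4,5,6,7,8], comments="#",
  dtype=np.float64)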

# people_train.txt
#
# sex (0 = male, 1 = female), age / 100,
# state (michigan = 100, nebraska = 010,
#  oklahoma = 001),
# income / 100_000,
# politics (conservative = 100,
#  moderate = 010, liberal = 001)
#
1,0.24,1,0,0,0.2950,0,0,1
0,0.39,0,0,1,0.5120,0,1,0
1,0.63,0,1,0,0.7580,1,0,0
0,0.36,1,0,0,0.4450,0,1,0
1,0.27,0,1,0,0.2860,0,0,1
1,0.50,0,1,0,0.5650,0,1,0
1,0.50,0,0,1,0.5500,0,1,0
0,0.19,0,0,1,0.3270,1,0,0
1,0.22,0,1,0,0.2770,0,1,0
0,0.39,0,0,1,0.4710,0,0,1
1,0.34,1,0,0,0.3940,0,1,0
0,0.22,1,0,0,0.3350,1,0,0
1,0.35,0,0,1,0.3520,0,0,1
0,0.33,0,1,0,0.4640,0,1,0
1,0.45,0,1,0,0.5410,0,1,0
1,0.42,0,1,0,0.5070,0,1,0
0,0.33,0,1,0,0.4680,0,1,0
1,0.25,0,0,1,0.3000,0,1,0
0,0.31,0,1,0,0.4640,1,0,0
1,0.27,1,0,0,0.3250,0,0,1
1,0.48,1,0,0,0.5400,0,1,0
0,0.64,0,1,0,0.7130,0,0,1
1,0.61,0,1,0,0.7240,1,0,0
1,0.54,0,0,1,0.6100,1,0,0
1,0.29,1,0,0,0.3630,1,0,0
1,0.50,0,0,1,0.5500,0,1,0
1,0.55,0,0,1,0.6250,1,0,0
1,0.40,1,0,0,0.5240,1,0,0
1,0.22,1,0,0,0.2360,0,0,1
1,0.68,0,1,0,0.7840,1,0,0
0,0.60,1,0,0,0.7170,0,0,1
0,0.34,0,0,1,0.4650,0,1,0
0,0.25,0,0,1,0.3710,1,0,0
0,0.31,0,1,0,0.4890,0,1,0
1,0.43,0,0,1,0.4800,0,1,0
1,0.58,0,1,0,0.6540,0,0,1
0,0.55,0,1,0,0.6070,0,0,1
0,0.43,0,1,0,0.5110,0,1,0
0,0.43,0,0,1,0.5320,0,1,0
0,0.21,1,0,0,0.3720,1,0,0
1,0.55,0,0,1,0.6460,1,0,0
1,0.64,0,1,0,0.7480,1,0,0
0,0.41,1,0,0,0.5880,0,1,0
1,0.64,0,0,1,0.7270,1,0,0
0,0.56,0,0,1,0.6660,0,0,1
1,0.31,0,0,1,0.3600,0,1,0
0,0.65,0,0,1,0.7010,0,0,1
1,0.55,0,0,1,0.6430,1,0,0
0,0.25,1,0,0,0.4030,1,0,0
1,0.46,0,0,1,0.5100,0,1,0
0,0.36,1,0,0,0.5350,1,0,0
1,0.52,0,1,0,0.5810,0,1,0
1,0.61,0,0,1,0.6790,1,0,0
1,0.57,0,0,1,0.6570,1,0,0
0,0.46,0,1,0,0.5260,0,1,0
0,0.62,1,0,0,0.6680,0,0,1
1,0.55,0,0,1,0.6270,1,0,0
0,0.22,0,0,1,0.2770,0,1,0
0,0.50,1,0,0,0.6290,1,0,0
0,0.32,0,1,0,0.4180,0,1,0
0,0.21,0,0,1,0.3560,1,0,0
1,0.44,0,1,0,0.5200,0,1,0
1,0.46,0,1,0,0.5170,0,1,0
1,0.62,0,1,0,0.6970,1,0,0
1,0.57,0,1,0,0.6640,1,0,0
0,0.67,0,0,1,0.7580,0,0,1
1,0.29,1,0,0,0.3430,0,0,1
1,0.53,1,0,0,0.6010,1,0,0
0,0.44,1,0,0,0.5480,0,1,0
1,0.46,0,1,0,0.5230,0,1,0
0,0.20,0,1,0,0.3010,0,1,0
0,0.38,1,0,0,0.5350,0,1,0
1,0.50,0,1,0,0.5860,0,1,0
1,0.33,0,1,0,0.4250,0,1,0
0,0.33,0,1,0,0.3930,0,1,0
1,0.26,0,1,0,0.4040,1,0,0
1,0.58,1,0,0,0.7070,1,0,0
1,0.43,0,0,1,0.4800,0,1,0
0,0.46,1,0,0,0.6440,1,0,0
1,0.60,1,0,0,0.7170,1,0,0
0,0.42,1,0,0,0.4890,0,1,0
0,0.56,0,0,1,0.5640,0,0,1
0,0.62,0,1,0,0.6630,0,0,1
0,0.50,1,0,0,0.6480,0,1,0
1,0.47,0,0,1,0.5200,0,1,0
0,0.67,0,1,0,0.8040,0,0,1
0,0.40,0,0,1,0.5040,0,1,0
1,0.42,0,1,0,0.4840,0,1,0
1,0.64,1,0,0,0.7200,1,0,0
0,0.47,1,0,0,0.5870,0,0,1
1,0.45,0,1,0,0.5280,0,1,0
0,0.25,0,0,1,0.4090,1,0,0
1,0.38,1,0,0,0.4840,1,0,0
1,0.55,0,0,1,0.6000,0,1,0
0,0.44,1,0,0,0.6060,0,1,0
1,0.33,1,0,0,0.4100,0,1,0
1,0.34,0,0,1,0.3900,0,1,0
1,0.27,0,1,0,0.3370,0,0,1
1,0.32,0,1,0,0.4070,0,1,0
1,0.42,0,0,1,0.4700,0,1,0
0,0.24,0,0,1,0.4030,1,0,0
1,0.42,0,1,0,0.5030,0,1,0
1,0.25,0,0,1,0.2800,0,0,1
1,0.51,0,1,0,0.5800,0,1,0
0,0.55,0,1,0,0.6350,0,0,1
1,0.44,1,0,0,0.4780,0,0,1
0,0.18,1,0,0,0.3980,1,0,0
0,0.67,0,1,0,0.7160,0,0,1
1,0.45,0,0,1,0.5000,0,1,0
1,0.48,1,0,0,0.5580,0,1,0
0,0.25,0,1,0,0.3900,0,1,0
0,0.67,1,0,0,0.7830,0,1,0
1,0.37,0,0,1,0.4200,0,1,0
0,0.32,1,0,0,0.4270,0,1,0
1,0.48,1,0,0,0.5700,0,1,0
0,0.66,0,0,1,0.7500,0,0,1
1,0.61,1,0,0,0.7000,1,0,0
0,0.58,0,0,1,0.6890,0,1,0
1,0.19,1,0,0,0.2400,0,0,1
1,0.38,0,0,1,0.4300,0,1,0
0,0.27,1,0,0,0.3640,0,1,0
1,0.42,1,0,0,0.4800,0,1,0
1,0.60,1,0,0,0.7130,1,0,0
0,0.27,0,0,1,0.3480,1,0,0
1,0.29,0,1,0,0.3710,1,0,0
0,0.43,1,0,0,0.5670,0,1,0
1,0.48,1,0,0,0.5670,0,1,0
1,0.27,0,0,1,0.2940,0,0,1
0,0.44,1,0,0,0.5520,1,0,0
1,0.23,0,1,0,0.2630,0,0,1
0,0.36,0,1,0,0.5300,0,0,1
1,0.64,0,0,1,0.7250,1,0,0
1,0.29,0,0,1,0.3000,0,0,1
0,0.33,1,0,0,0.4930,0,1,0
0,0.66,0,1,0,0.7500,0,0,1
0,0.21,0,0,1,0.3430,1,0,0
1,0.27,1,0,0,0.3270,0,0,1
1,0.29,1,0,0,0.3180,0,0,1
0,0.31,1,0,0,0.4860,0,1,0
1,0.36,0,0,1,0.4100,0,1,0
1,0.49,0,1,0,0.5570,0,1,0
0,0.28,1,0,0,0.3840,1,0,0
0,0.43,0,0,1,0.5660,0,1,0
0,0.46,0,1,0,0.5880,0,1,0
1,0.57,1,0,0,0.6980,1,0,0
0,0.52,0,0,1,0.5940,0,1,0
0,0.31,0,0,1,0.4350,0,1,0
0,0.55,1,0,0,0.6200,0,0,1
1,0.50,1,0,0,0.5640,0,1,0
1,0.48,0,1,0,0.5590,0,1,0
0,0.22,0,0,1,0.3450,1,0,0
1,0.59,0,0,1,0.6670,1,0,0
1,0.34,1,0,0,0.4280,0,0,1
0,0.64,1,0,0,0.7720,0,0,1
1,0.29,0,0,1,0.3350,0,0,1
0,0.34,0,1,0,0.4320,0,1,0
0,0.61,1,0,0,0.7500,0,0,1
1,0.64,0,0,1,0.7110,1,0,0
0,0.29,1,0,0,0.4130,1,0,0
1,0.63,0,1,0,0.7060,1,0,0
0,0.29,0,1,0,0.4000,1,0,0
0,0.51,1,0,0,0.6270,0,1,0
0,0.24,0,0,1,0.3770,1,0,0
1,0.48,0,1,0,0.5750,0,1,0
1,0.18,1,0,0,0.2740,1,0,0
1,0.18,1,0,0,0.2030,0,0,1
1,0.33,0,1,0,0.3820,0,0,1
0,0.20,0,0,1,0.3480,1,0,0
1,0.29,0,0,1,0.3300,0,0,1
0,0.44,0,0,1,0.6300,1,0,0
0,0.65,0,0,1,0.8180,1,0,0
0,0.56,1,0,0,0.6370,0,0,1
0,0.52,0,0,1,0.5840,0,1,0
0,0.29,0,1,0,0.4860,1,0,0
0,0.47,0,1,0,0.5890,0,1,0
1,0.68,1,0,0,0.7260,0,0,1
1,0.31,0,0,1,0.3600,0,1,0
1,0.61,0,1,0,0.6250,0,0,1
1,0.19,0,1,0,0.2150,0,0,1
1,0.38,0,0,1,0.4300,0,1,0
0,0.26,1,0,0,0.4230,1,0,0
1,0.61,0,1,0,0.6740,1,0,0
1,0.40,1,0,0,0.4650,0,1,0
0,0.49,1,0,0,0.6520,0,1,0
1,0.56,1,0,0,0.6750,1,0,0
0,0.48,0,1,0,0.6600,0,1,0
1,0.52,1,0,0,0.5630,0,0,1
0,0.18,1,0,0,0.2980,1,0,0
0,0.56,0,0,1,0.5930,0,0,1
0,0.52,0,1,0,0.6440,0,1,0
0,0.18,0,1,0,0.2860,0,1,0
0,0.58,1,0,0,0.6620,0,0,1
0,0.39,0,1,0,0.5510,0,1,0
0,0.46,1,0,0,0.6290,0,1,0
0,0.40,0,1,0,0.4620,0,1,0
0,0.60,1,0,0,0.7270,0,0,1
1,0.36,0,1,0,0.4070,0,0,1
1,0.44,1,0,0,0.5230,0,1,0
1,0.28,1,0,0,0.3130,0,0,1
1,0.54,0,0,1,0.6260,1,0,0

Test data.

# people_test.txt
#
0,0.51,1,0,0,0.6120,0,1,0
0,0.32,0,1,0,0.4610,0,1,0
1,0.55,1,0,0,0.6270,1,0,0
1,0.25,0,0,1,0.2620,0,0,1
1,0.33,0,0,1,0.3730,0,0,1
0,0.29,0,1,0,0.4620,1,0,0
1,0.65,1,0,0,0.7270,1,0,0
0,0.43,0,1,0,0.5140,0,1,0
0,0.54,0,1,0,0.6480,0,0,1
1,0.61,0,1,0,0.7270,1,0,0
1,0.52,0,1,0,0.6360,1,0,0
1,0.30,0,1,0,0.3350,0,0,1
1,0.29,1,0,0,0.3140,0,0,1
0,0.47,0,0,1,0.5940,0,1,0
1,0.39,0,1,0,0.4780,0,1,0
1,0.47,0,0,1,0.5200,0,1,0
0,0.49,1,0,0,0.5860,0,1,0
0,0.63,0,0,1,0.6740,0,0,1
0,0.30,1,0,0,0.3920,1,0,0
0,0.61,0,0,1,0.6960,0,0,1
0,0.47,0,0,1,0.5870,0,1,0
1,0.30,0,0,1,0.3450,0,0,1
0,0.51,0,0,1,0.5800,0,1,0
0,0.24,1,0,0,0.3880,0,1,0
0,0.49,1,0,0,0.6450,0,1,0
1,0.66,0,0,1,0.7450,1,0,0
0,0.65,1,0,0,0.7690,1,0,0
0,0.46,0,1,0,0.5800,1,0,0
0,0.45,0,0,1,0.5180,0,1,0
0,0.47,1,0,0,0.6360,1,0,0
0,0.29,1,0,0,0.4480,1,0,0
0,0.57,0,0,1,0.6930,0,0,1
0,0.20,1,0,0,0.2870,0,0,1
0,0.35,1,0,0,0.4340,0,1,0
0,0.61,0,0,1,0.6700,0,0,1
0,0.31,0,0,1,0.3730,0,1,0
1,0.18,1,0,0,0.2080,0,0,1
1,0.26,0,0,1,0.2920,0,0,1
0,0.28,1,0,0,0.3640,0,0,1
0,0.59,0,0,1,0.6940,0,0,1