## Ordinal Classification Using PyTorch in Visual Studio Magazine

I wrote an article titled “Ordinal Classification Using PyTorch” in the October 2021 edition of the Microsoft Visual Studio Magazine. See https://visualstudiomagazine.com/articles/2021/10/04/ordinal-classification-pytorch.aspx.

The goal of an ordinal classification problem is to predict a discrete value, where the set of possible values is ordered. For example, you might want to predict the price of a house (based on predictors such as area, number of bedrooms and so on) where the possible price values are 0 (low), 1 (medium), 2 (high), 3 (very high). Ordinal classification is different from a standard, non-ordinal classification problem, where the set of values to predict is categorical and is not ordered. For example, predicting the exterior color of a car, where 0 = white, 1 = silver, 2 = black and so on, is a standard classification problem.

There are a surprisingly large number of techniques for ordinal classification. My article presents a relatively simple technique that I’ve used with good success. To the best of my knowledge the technique has not been published and does not have a standard name. Briefly, the idea is to programmatically convert ordinal labels such as (0, 1, 2, 3) to floating point targets such as (0.125, 0.375, 0.625, 0.875) and then use a neural network regression approach.
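The conversion can be sketched as follows. Each of the n ordinal labels is mapped to the midpoint of the k-th of n equal-width slices of [0, 1]; the helper function name here is my own, not from the article.

```python
# Convert ordinal label k in 0..n-1 to a float target in (0.0, 1.0).
# Each target is the midpoint of the k-th of n equal-width slices
# of [0, 1]: target = (2k + 1) / (2n).
def label_to_target(k, n):
    return (2 * k + 1) / (2 * n)

targets = [label_to_target(k, 4) for k in range(4)]
print(targets)  # [0.125, 0.375, 0.625, 0.875]
```

With n = 4 this reproduces the (0.125, 0.375, 0.625, 0.875) targets mentioned above.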

I explained using a specific demo example. The demo predicted the price of a house (0 = low, 1 = medium, 2 = high, 3 = very high) based on predictor variables (air conditioning, -1 = no, +1 = yes), normalized area in square feet (e.g., 0.2500 = 2,500 sq. feet), style (art_deco, bungalow, colonial) and local school (johnson, kennedy, lincoln).
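A single encoded input item might look like the sketch below. The ordering of the one-hot slots (alphabetical) and the exact normalization are assumptions based on the description above, not pulled from the article's code.

```python
# Hypothetical encoding of one house's predictors:
# AC is -1/+1, area is normalized square feet (sq. ft. / 10,000),
# style and school are one-hot encoded in alphabetical order.
ac = -1.0                        # no air conditioning
area = 0.2500                    # 2,500 sq. feet
style = [0.0, 0.0, 1.0]          # colonial (art_deco, bungalow, colonial)
school = [0.0, 1.0, 0.0]         # kennedy (johnson, kennedy, lincoln)
x = [ac, area] + style + school  # 8 input values total
print(len(x))  # 8
```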

The demo loaded a 200-item set of training data and a 40-item set of test data into memory. During the loading process, the ordinal class labels (0, 1, 2, 3) were converted to float targets (0.125, 0.375, 0.625, 0.875). The mapping of ordinal labels to float targets is the key to the ordinal classification technique.

The demo created an 8-(10-10)-1 deep neural network. There are 8 input nodes (one for each predictor value after encoding house style and local school), two hidden layers with 10 processing nodes each, and a single output node. The neural network emits an output value in the range 0.0 to 1.0 that corresponds to the float targets.
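An 8-(10-10)-1 network of this shape can be sketched in PyTorch as below. The tanh hidden activations and the sigmoid on the output node (which keeps the output in the 0.0 to 1.0 range) are reasonable assumptions, not necessarily the article's exact choices.

```python
import torch as T

class Net(T.nn.Module):
    """Sketch of an 8-(10-10)-1 regression-style network."""
    def __init__(self):
        super().__init__()
        self.hid1 = T.nn.Linear(8, 10)   # 8 encoded predictors in
        self.hid2 = T.nn.Linear(10, 10)
        self.oupt = T.nn.Linear(10, 1)   # single output node

    def forward(self, x):
        z = T.tanh(self.hid1(x))
        z = T.tanh(self.hid2(z))
        z = T.sigmoid(self.oupt(z))      # squash output into (0.0, 1.0)
        return z

net = Net()
out = net(T.randn(3, 8))  # batch of 3 dummy items
print(out.shape)          # torch.Size([3, 1])
```

Training would then use a standard regression loss such as MSE against the float targets.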

After training, the model achieved a prediction accuracy of 89.5 percent on the training data (179 of 200 correct) and 82.5 percent accuracy on the test data (33 of 40 correct). The demo concluded by making a prediction for a new, previously unseen house. The predictor values were air conditioning = no (-1), area = 0.2300 (2,300 square feet), style = colonial (0, 0, 1) and local school = kennedy (0, 1, 0). The raw computed output value was 0.7215, which mapped to class label 2, an ordinal price of "high".
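Decoding a raw output back to an ordinal label can be sketched as below. This version finds which of the n equal-width slices of [0, 1] the output falls into; picking the nearest float target gives the same answer, and the function name is my own.

```python
# Map a raw network output y in (0.0, 1.0) back to an ordinal class
# label by finding which of the n equal-width slices of [0, 1] it
# falls in. The min() guards against y == 1.0 landing out of range.
def output_to_label(y, n):
    return min(int(y * n), n - 1)

print(output_to_label(0.7215, 4))  # 2, i.e. "high"
```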

The novel “Adventures of Huckleberry Finn” was published in 1885. It is almost universally regarded as one of the greatest American novels ever published — an ordinal rating of 5 on a scale of 1-5. Reading the book is required by nearly 100% of the top-rated high schools in the U.S. I read “Huck Finn” in high school and it was easily one of my favorite books.
