## Decision Tree Classification

I’m not a big fan of decision tree classification. Although decision trees have a couple of advantages over neural network classifiers (simplicity, and some interpretability), decision trees rarely work as well as neural networks, at least on the types of problems I deal with.

Here’s an example of a decision tree for the well-known Iris Dataset, using the scikit-learn library.

```python
# iris_tree.py

from sklearn import datasets
from sklearn import tree

def show_iris_tree(model):
    # adapted from https://www.kdnuggets.com/2017/05/simplifying-decision-tree-interpretation-decision-rules-python.html
    print("")
    left = model.tree_.children_left
    right = model.tree_.children_right
    thresh = model.tree_.threshold
    # map each node's feature index to a name; leaf nodes have index -2
    feature_names = ['sepal-len', 'sepal-wid', 'petal-len', 'petal-wid']
    features = [feature_names[i] if i >= 0 else 'leaf'
                for i in model.tree_.feature]
    value = model.tree_.value

    def process(left, right, thresh, features, node, depth=0):
        indent = "  " * depth
        if thresh[node] != -2:  # internal (decision) node
            print(indent, end="")
            print("if ( %s <= %0.4f ) {" % (features[node], thresh[node]))
            if left[node] != -1:
                process(left, right, thresh, features, left[node], depth+1)
            print(indent, "} else {")
            if right[node] != -1:
                process(left, right, thresh, features, right[node], depth+1)
            print(indent, "}")
        else:  # leaf node: class distribution [setosa, versicolor, virginica]
            print(indent, "return " + str(value[node]))

    process(left, right, thresh, features, 0)

# ==============================================================

print("\nBegin Iris decision tree example \n")

iris = datasets.load_iris()  # the 150-item Iris dataset
X = iris.data
y = iris.target

print("Creating decision tree max_depth=3")
tr = tree.DecisionTreeClassifier(max_depth=3)
tr.fit(X, y)
print("Done")

show_iris_tree(tr)

print("\nEnd decision tree demo ")
```
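Once fitted, the same tree object can be used for prediction. A minimal sketch (the sample measurements below are made up for illustration, chosen to look like a typical Iris setosa):

```python
from sklearn import datasets
from sklearn.tree import DecisionTreeClassifier

iris = datasets.load_iris()
clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(iris.data, iris.target)

# hypothetical item: [sepal-len, sepal-wid, petal-len, petal-wid]
sample = [[5.1, 3.5, 1.4, 0.2]]
pred = clf.predict(sample)
print(iris.target_names[pred[0]])  # setosa

# accuracy of the shallow tree on the training data
print(clf.score(iris.data, iris.target))
```

Even with max_depth=3 the tree separates setosa perfectly at the root split, which is part of why Iris is such a gentle test case.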

So, no real moral to the story, just a little investigation during my lunch break.

“Olive Trees with Yellow Sky and Sun” (1889), Vincent Van Gogh