I was looking at k-NN classification using the scikit library recently. While wading through the scikit documentation, I noticed that scikit has a closely related Radius Neighbors classifier module. When I did my k-NN example, I used the Wheat Seeds dataset where the goal is to predict the species of a wheat seed (0 = Kama, 1 = Rosa, 2 = Canadian) from seven predictor variables: seed length, width, perimeter, and so on.
The most difficult part of the k-NN experiment was preparing the training and test data. Because I'd already done all the data preparation, I figured I'd apply the radius neighbors classifier to the prepared wheat seed data.
In k-NN the unknown input to classify is compared to the k closest labeled data items in the training data (using Euclidean distance), and the most common class label is the prediction. For example, if k = 5, and the five closest data items to the item to predict have class labels (2, 0, 2, 2, 1), then the predicted class label is 2.
In radius neighbors classification, instead of specifying k, the number of nearest neighbors to examine, you specify a radius. All the labeled data items within the radius of the item to classify are examined, and the most common label is the prediction.
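To make the difference between the two voting schemes concrete, here is a minimal from-scratch sketch of both. The function names and logic are mine, not scikit's; in the demo program below, scikit's RadiusNeighborsClassifier does the real work.

import numpy as np
from collections import Counter

def knn_predict(train_X, train_y, x, k):
  # vote among the k closest labeled items (Euclidean distance)
  dists = np.linalg.norm(train_X - x, axis=1)
  nearest = np.argsort(dists)[:k]
  return Counter(train_y[nearest]).most_common(1)[0][0]

def radius_predict(train_X, train_y, x, radius):
  # vote among all labeled items that fall within the radius
  dists = np.linalg.norm(train_X - x, axis=1)
  inside = np.where(dists <= radius)[0]
  if len(inside) == 0:
    return None  # no neighbors at all; the caller must decide
  return Counter(train_y[inside]).most_common(1)[0][0]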
The raw Wheat Seeds data came from archive.ics.uci.edu/ml/datasets/seeds and looks like:
15.26 14.84 0.871 5.763 3.312 2.221 5.22 1
14.88 14.57 0.8811 5.554 3.333 1.018 4.956 1
. . .
17.63 15.98 0.8673 6.191 3.561 4.076 6.06 2
16.84 15.67 0.8623 5.998 3.484 4.675 5.877 2
. . .
11.84 13.21 0.8521 5.175 2.836 3.598 5.044 3
12.3 13.34 0.8684 5.243 2.974 5.637 5.063 3
There are 210 data items. Each represents one of three species of wheat seeds: Kama, Rosa, Canadian. There are 70 of each species. The first 7 values on each line are the predictors: area, perimeter, compactness, length, width, asymmetry, groove. The eighth value in the raw data is the one-based encoded species. The goal is to predict species from the seven predictor values.
When using any of the neighbors classification techniques, it's important to normalize the numeric predictors so that they all have roughly the same magnitude, so that a predictor with large values doesn't overwhelm the other predictors. As is often the case in machine learning, data preparation takes most of the time and effort of any exploration.
I dropped the raw data into an Excel spreadsheet. For each predictor, I computed the min and max values of the column. Then I performed min-max normalization where each value x in a column is normalized to x’ = (x – min) / (max – min). The result is that each predictor is a value between 0.0 and 1.0.
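The same normalization can be done programmatically. Here's a minimal numpy sketch as an alternative to Excel; the file name wheat_raw.txt is hypothetical, standing in for the 210 raw whitespace-delimited lines shown above.

import numpy as np

raw = np.loadtxt("wheat_raw.txt")    # 210 rows, 8 columns
X = raw[:, 0:7]                      # the seven predictors
mins = X.min(axis=0)                 # per-column min
maxs = X.max(axis=0)                 # per-column max
norm_X = (X - mins) / (maxs - mins)  # each x' = (x - min) / (max - min)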
I recoded the target class labels from one-based to zero-based. The resulting 210-item dataset looks like:
0.4410 0.5021 0.5708 0.4865 0.4861 0.1893 0.3452 0
0.4051 0.4463 0.6624 0.3688 0.5011 0.0329 0.2152 0
. . .
0.6648 0.7376 0.5372 0.7275 0.6636 0.4305 0.7587 1
0.5902 0.6736 0.4918 0.6188 0.6087 0.5084 0.6686 1
. . .
0.1917 0.2603 0.3630 0.2877 0.2003 0.3304 0.3506 2
0.2049 0.2004 0.8013 0.0980 0.3742 0.2682 0.1531 2
I split the 210-item normalized data into a 180-item training set and a 30-item test set. I used the first 60 of each target class for training and the last 10 of each target class for testing. Put another way, for the training data, items [0] to [59] are class 0, items [60] to [119] are class 1, and items [120] to [179] are class 2.
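A quick sketch of that split, assuming the 210 normalized items sit in a (210, 8) numpy array named data, ordered 70 Kama, then 70 Rosa, then 70 Canadian (the array name is mine, not part of the demo):

import numpy as np

# data: (210, 8) normalized array, ordered 70 items of each class
train_rows, test_rows = [], []
for c in range(3):
  start = c * 70
  train_rows += list(range(start, start + 60))       # first 60 to train
  test_rows += list(range(start + 60, start + 70))   # last 10 to test
train_data = data[train_rows]  # 180 items
test_data = data[test_rows]    # 30 items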
Using scikit is easy. After loading the training and test data into memory, a radius neighbors multi-class classification model is created and trained like so:
import numpy as np
from sklearn.neighbors import RadiusNeighborsClassifier
. . .
rad = 0.38
print("Creating k-RN model, with radius = %0.2f " % rad)
model = RadiusNeighborsClassifier(radius=rad,
  algorithm='brute')
model.fit(train_X, train_y)
print("Done ")
The hard part is determining the radius value. The default value is 1.0, but that radius was too big: it included virtually all of the training data. When I used a radius of 0.2, none of the labeled items were within that radius. I set up a dummy input to predict, with all predictor values equal to 0.5. With radius = 0.38 the predicted pseudo-probabilities are [0.65, 0.35, 0.00]. Because the largest value is at index [0], the predicted wheat seed species is 0 = Kama.
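Finding a workable radius takes experimentation. A hedged sketch of one way to probe candidate values, by counting how many training items fall within the radius of each test item (this loop is not part of the demo program; it assumes train_X and test_X are loaded as in the demo below):

import numpy as np
from sklearn.neighbors import NearestNeighbors

nn = NearestNeighbors().fit(train_X)  # Euclidean distance by default
for rad in [0.20, 0.30, 0.38, 0.50, 1.00]:
  dists, idxs = nn.radius_neighbors(test_X, radius=rad)
  counts = np.array([len(ix) for ix in idxs])  # neighbors per test item
  print("radius %0.2f: neighbors min = %d, mean = %0.1f, max = %d"
    % (rad, counts.min(), counts.mean(), counts.max()))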
I analyzed the results of the prediction using the radius_neighbors() method:
print("The idxs of neighbors within %0.2f are: " % rad) (dists, idxs) = model.radius_neighbors(X) np.set_printoptions(linewidth=40) print(idxs)
This code gave me the indexes of the labeled training items that are within 0.38 of the input X = [0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5]:
[ 0, 6, 10, 20, 31, 32, 34, 36, 38, 43, 49, 50, 51, 52, 55, 61, 62, 64, 65, 66, 70, 90, 112]
There are 23 data items. Recall that the 15 items at indexes 0, 6, 10, 20, 31, 32, 34, 36, 38, 43, 49, 50, 51, 52, 55 are class 0, and the 8 items at indexes 61, 62, 64, 65, 66, 70, 90, 112 are class 1, and none of the class 2 items at indexes 120 to 179 appear.
Therefore, the probability of class 0 is 15 / 23 = 0.65, and the probability of class 1 is 8 / 23 = 0.35, and the probability of class 2 is 0 / 23 = 0.00.
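Those pseudo-probabilities can be verified by hand from the neighbor indexes and the training labels. A small sketch, where idxs is the result of the radius_neighbors() call above:

import numpy as np

neighbor_labels = train_y[idxs[0]]                  # labels of the 23 neighbors
counts = np.bincount(neighbor_labels, minlength=3)  # [15 8 0]
print(counts / counts.sum())                        # approx [0.65 0.35 0.00]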
Radius neighbors classification isn’t used very often. In practice, it’s a bit too difficult to specify the radius value.
The key to the radius neighbors classification algorithm is the concept of a radius. Most people, including me, first come into contact with radius in a geometry class: circles and spheres.

Three memorable spherical spaceships from science fiction movies. Left: The alien ship from "It Came From Outer Space" (1953). The ship crashed on Earth and the (good) aliens impersonated townspeople to get supplies to repair their craft. Center: The Aries 1B from "2001: A Space Odyssey" (1968) is a shuttle for travel between a space station in Earth orbit and the Moon. Right: The Heart of Gold from "The Hitchhiker's Guide to the Galaxy" (2005).
Demo code:
# wheat_krn.py
# radius neighbor version of k-NN
# predict wheat seed species (0=Kama, 1=Rosa, 2=Canadian)
# from area, perimeter, compactness, length, width,
# asymmetry, groove
# Anaconda3-2020.02  Python 3.7.6  scikit 0.22.1
# Windows 10/11

import numpy as np
from sklearn.neighbors import RadiusNeighborsClassifier

# ---------------------------------------------------------

def show_confusion(cm):
  dim = len(cm)
  mx = np.max(cm)             # largest count in cm
  wid = len(str(mx)) + 1      # width to print
  fmt = "%" + str(wid) + "d"  # like "%3d"
  for i in range(dim):
    print("actual ", end="")
    print("%3d:" % i, end="")
    for j in range(dim):
      print(fmt % cm[i][j], end="")
    print("")
  print("------------")
  print("predicted ", end="")
  for j in range(dim):
    print(fmt % j, end="")
  print("")

# ---------------------------------------------------------

def main():
  # 0. prepare
  print("\nBegin Wheat Seeds radius neighbors using scikit ")
  np.set_printoptions(precision=4, suppress=True)
  np.random.seed(1)

  # 1. load data
  # 0.4410 0.5021 0.5708 0.4865 0.4861 0.1893 0.3452 0
  # 0.4051 0.4463 0.6624 0.3688 0.5011 0.0329 0.2152 0
  # . . .
  # 0.1917 0.2603 0.3630 0.2877 0.2003 0.3304 0.3506 2
  # 0.2049 0.2004 0.8013 0.0980 0.3742 0.2682 0.1531 2
  print("\nLoading train and test data ")
  train_file = ".\\Data\\wheat_train.txt"  # 180 items
  train_X = np.loadtxt(train_file, usecols=[0,1,2,3,4,5,6],
    delimiter="\t", dtype=np.float32, comments="#")
  train_y = np.loadtxt(train_file, usecols=[7],
    delimiter="\t", dtype=np.int64, comments="#")

  test_file = ".\\Data\\wheat_test.txt"  # 30 items
  test_X = np.loadtxt(test_file, usecols=[0,1,2,3,4,5,6],
    delimiter="\t", dtype=np.float32, comments="#")
  test_y = np.loadtxt(test_file, usecols=[7],
    delimiter="\t", dtype=np.int64, comments="#")

  print("\nTraining data:")
  print(train_X[0:4])
  print(". . . \n")
  print(train_y[0:4])
  print(". . . ")

  # 2. create and train model
  # RadiusNeighborsClassifier(radius=1.0, *, weights='uniform',
  #  algorithm='auto', leaf_size=30, p=2, metric='minkowski',
  #  outlier_label=None, metric_params=None, n_jobs=None)
  # algorithm: 'ball_tree', 'kd_tree', 'brute', 'auto'.
  rad = 0.38
  print("\nCreating k-RN model, with radius = %0.2f " % rad)
  model = RadiusNeighborsClassifier(radius=rad,
    algorithm='brute')
  model.fit(train_X, train_y)
  print("Done ")

  # 3. evaluate model
  train_acc = model.score(train_X, train_y)
  test_acc = model.score(test_X, test_y)
  print("\nAccuracy on train data = %0.4f " % train_acc)
  print("Accuracy on test data = %0.4f " % test_acc)

  from sklearn.metrics import confusion_matrix
  y_predicteds = model.predict(test_X)
  cm = confusion_matrix(test_y, y_predicteds)
  print("\nConfusion matrix raw: \n")
  # print(cm)
  show_confusion(cm)  # custom formatted

  # 4. use model
  X = np.array([[0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5]],
    dtype=np.float32)
  print("\nPredicting wheat species for: ")
  print(X)
  probs = model.predict_proba(X)
  print("\nPrediction probs: ")
  print(probs)

  print("\nThe idxs of neighbors within %0.2f are: " % rad)
  (dists, idxs) = model.radius_neighbors(X)
  np.set_printoptions(linewidth=40)
  print(idxs)

  predicted = model.predict(X)
  print("\nPredicted class: ")
  print(predicted)

  # 5. TODO: save model using pickle

  print("\nEnd demo ")

if __name__ == "__main__":
  main()
Training data. Replace commas with tabs or modify program code.
# wheat_train.txt
#
# http://archive.ics.uci.edu/ml/datasets/seeds
# 210 total items. train is first 60 each of 3 classes
# 180 training, 30 test
# area, perimeter, compactness, length, width, asymmetry, groove
# predictors are all min-max normalized
# 0 = Kama, 1 = Rosa, 2 = Canadian
#
0.4410,0.5021,0.5708,0.4865,0.4861,0.1893,0.3452,0
0.4051,0.4463,0.6624,0.3688,0.5011,0.0329,0.2152,0
0.3494,0.3471,0.8793,0.2207,0.5039,0.2515,0.1507,0
0.3069,0.3161,0.7931,0.2393,0.5339,0.1942,0.1408,0
0.5241,0.5331,0.8648,0.4274,0.6643,0.0767,0.3230,0
0.3579,0.3719,0.7895,0.2742,0.4861,0.2206,0.2152,0
0.3872,0.4298,0.6515,0.3739,0.4483,0.3668,0.3447,0
0.3324,0.3492,0.7532,0.2934,0.4790,0.2516,0.2368,0
0.5703,0.6302,0.6044,0.6498,0.5952,0.1658,0.6686,0
0.5524,0.5868,0.7250,0.5546,0.6237,0.1565,0.4993,0
0.4410,0.5041,0.5581,0.4589,0.4362,0.4912,0.3914,0
0.3248,0.3616,0.6488,0.3035,0.4070,0.1238,0.2373,0
0.3116,0.3326,0.7250,0.3041,0.4056,0.4188,0.1078,0
0.3012,0.3409,0.6152,0.3266,0.3749,0.3083,0.1738,0
0.2975,0.3388,0.6016,0.3283,0.3450,0.2817,0.1507,0
0.3777,0.3864,0.8276,0.2545,0.5011,0.4447,0.1290,0
0.3211,0.2934,1.0000,0.1239,0.5367,0.5811,0.1290,0
0.4816,0.4835,0.8866,0.3536,0.6301,0.1084,0.2595,0
0.3881,0.3719,0.9728,0.1723,0.5959,0.1303,0.0640,0
0.2011,0.2397,0.5490,0.1841,0.2986,0.4339,0.1945,0
0.3371,0.4112,0.4564,0.4274,0.3557,0.3000,0.3235,0
0.3324,0.3822,0.5817,0.3497,0.3835,0.2500,0.3447,0
0.4995,0.5145,0.8230,0.4048,0.6251,0.0000,0.2816,0
0.1407,0.1694,0.5290,0.1126,0.2181,0.0845,0.2176,0
0.4174,0.4855,0.5227,0.5011,0.4383,0.1334,0.2373,0
0.5288,0.5682,0.6969,0.5259,0.5638,0.0179,0.3880,0
0.2295,0.2789,0.5082,0.2793,0.2823,0.3391,0.1507,0
0.2030,0.2603,0.4383,0.2793,0.2324,0.2261,0.1723,0
0.3324,0.3657,0.6706,0.3615,0.4212,0.2586,0.2555,0
0.2701,0.3326,0.4746,0.3474,0.3100,0.3596,0.2846,0
0.2427,0.2913,0.5272,0.3125,0.2459,0.0117,0.2644,0
0.4627,0.5227,0.5835,0.4831,0.5282,0.3442,0.3491,0
0.3305,0.4132,0.4065,0.4606,0.3963,0.4102,0.3840,0
0.3163,0.3636,0.5871,0.3863,0.3706,0.1767,0.2427,0
0.4212,0.4690,0.6334,0.4578,0.4975,0.1773,0.4141,0
0.5222,0.5351,0.8339,0.4561,0.6094,0.1957,0.4549,0
0.5297,0.5909,0.5926,0.5220,0.5944,0.2676,0.4963,0
0.6128,0.6136,0.9056,0.5253,0.7505,0.2849,0.4751,0
0.3975,0.4360,0.6733,0.4262,0.4690,0.3052,0.3890,0
0.3484,0.3636,0.7831,0.2804,0.4761,0.7697,0.2373,0
0.2786,0.2975,0.7169,0.2528,0.3749,0.2369,0.3245,0
0.2748,0.2975,0.6996,0.2545,0.3763,0.1929,0.3235,0
0.2427,0.2355,0.8421,0.1346,0.4070,0.2205,0.1300,0
0.4636,0.5062,0.6706,0.5507,0.5460,0.5131,0.4968,0
0.4268,0.4401,0.8212,0.3829,0.5930,0.3072,0.3255,0
0.3031,0.3368,0.6470,0.2686,0.3742,0.1034,0.2176,0
0.4504,0.4855,0.7078,0.4516,0.5438,0.0783,0.3018,0
0.4155,0.4442,0.7278,0.3778,0.5324,0.2851,0.3230,0
0.3966,0.4360,0.6697,0.3637,0.4711,0.2521,0.2915,0
0.4032,0.4669,0.5399,0.4386,0.4476,0.1773,0.4097,0
0.3626,0.4112,0.6080,0.3863,0.4576,0.4174,0.3077,0
0.4901,0.5165,0.7641,0.4364,0.5731,0.6277,0.3038,0
0.3683,0.4545,0.4147,0.4595,0.3443,0.4357,0.4318,0
0.3532,0.3864,0.6806,0.3407,0.4056,0.3332,0.3471,0
0.3711,0.4525,0.4319,0.4741,0.3443,0.0931,0.4766,0
0.4193,0.4876,0.5236,0.4521,0.4148,0.1519,0.4530,0
0.3654,0.4008,0.6688,0.2753,0.5324,0.2648,0.2585,0
0.4089,0.4174,0.8394,0.2731,0.5574,0.0490,0.2802,0
0.4523,0.4876,0.7042,0.4296,0.5624,0.1604,0.3461,0
0.1435,0.2190,0.2822,0.1464,0.2865,0.0958,0.0000,0
0.6648,0.7376,0.5372,0.7275,0.6636,0.4305,0.7587,1
0.5902,0.6736,0.4918,0.6188,0.6087,0.5084,0.6686,1
0.6298,0.6860,0.6189,0.6075,0.6871,0.4907,0.6263,1
0.8045,0.7955,0.9074,0.7066,0.9266,0.2823,0.7681,1
0.5883,0.6405,0.6397,0.6295,0.6101,0.4211,0.6509,1
0.5836,0.6632,0.5054,0.5788,0.5759,0.5402,0.6283,1
0.6355,0.7231,0.4701,0.6560,0.5510,0.3977,0.6908,1
0.9556,0.9959,0.6189,0.9459,0.8439,0.4793,0.9513,1
0.7885,0.8430,0.6071,0.8705,0.7192,0.5590,0.9074,1
0.6166,0.6488,0.7359,0.5355,0.6671,0.2721,0.6041,1
0.5609,0.6054,0.6733,0.5495,0.5966,0.6198,0.6701,1
0.7677,0.7810,0.8131,0.6233,0.8746,0.5928,0.6696,1
0.9075,0.9256,0.7377,0.7804,0.8795,0.5731,0.8213,1
0.8480,0.8946,0.6334,0.8361,0.8140,0.0919,0.8636,1
0.8423,0.8884,0.6343,0.8260,0.8346,0.2856,0.8203,1
0.7252,0.7603,0.7160,0.7173,0.7277,0.2182,0.8262,1
0.7828,0.7955,0.8058,0.6672,0.8083,0.1149,0.7829,1
0.7923,0.8781,0.4619,0.9291,0.7413,0.3804,0.9744,1
1.0000,0.9917,0.8240,0.9426,1.0000,0.6521,0.8429,1
0.9717,0.9587,0.8621,0.8733,0.9993,0.5527,0.8872,1
0.8980,0.9463,0.6034,0.9471,0.8232,0.1547,0.9503,1
0.7715,0.7831,0.8194,0.7168,0.8311,0.3062,0.7553,1
0.7762,0.8017,0.7486,0.7731,0.7577,0.3214,0.7553,1
0.7554,0.7521,0.8938,0.6408,0.8767,0.6808,0.6686,1
0.7337,0.8492,0.3367,0.9949,0.6094,0.5419,0.9498,1
0.5930,0.6694,0.5145,0.6982,0.5937,0.3811,0.7129,1
0.8234,0.8636,0.6661,0.8119,0.8411,0.3526,0.8464,1
0.7923,0.8595,0.5499,0.8727,0.6572,0.1793,0.9522,1
0.7158,0.7955,0.5045,0.7725,0.6287,0.2715,0.8636,1
0.7677,0.8120,0.6615,0.7432,0.7512,0.1850,0.7770,1
0.5496,0.5868,0.7123,0.4611,0.6379,0.4488,0.5411,1
0.6988,0.7128,0.8267,0.5580,0.7584,0.1694,0.6489,1
0.8376,0.8450,0.8203,0.6836,0.8995,0.4607,0.7336,1
0.8111,0.8719,0.5771,0.8277,0.7491,0.3370,0.8419,1
0.7894,0.8285,0.6788,0.7596,0.8019,0.3384,0.8021,1
0.7781,0.8017,0.7586,0.6408,0.8239,0.2325,0.6696,1
0.7800,0.7769,0.8848,0.7055,0.8382,0.2702,0.8277,1
0.6648,0.7128,0.6525,0.6385,0.6721,0.3877,0.6942,1
0.8829,0.9318,0.6089,1.0000,0.8076,0.3234,1.0000,1
0.7517,0.7872,0.7114,0.7061,0.7441,0.1265,0.6770,1
0.7422,0.7665,0.7623,0.6802,0.8118,0.1911,0.6278,1
0.8300,0.8905,0.5762,0.7905,0.8275,0.3787,0.7120,1
0.8064,0.8058,0.8657,0.7230,0.9066,0.1747,0.6918,1
0.8074,0.8678,0.5817,0.7658,0.7890,0.7693,0.7553,1
0.9802,1.0000,0.7060,0.9369,0.9701,0.5086,0.8848,1
0.7998,0.8347,0.7015,0.8542,0.7762,0.1928,0.8095,1
0.7904,0.7831,0.9038,0.6486,0.9031,0.4640,0.6061,1
0.8083,0.8347,0.7341,0.7579,0.8446,0.3015,0.8203,1
0.7838,0.7893,0.8412,0.7477,0.8118,0.3737,0.7125,1
0.8914,0.9277,0.6624,0.8975,0.8746,0.2988,0.8868,1
0.9112,0.9298,0.7405,0.7973,0.9494,0.6678,0.8218,1
0.7129,0.7665,0.6270,0.6532,0.6650,0.3711,0.7346,1
0.5269,0.6136,0.4601,0.4859,0.5396,0.4578,0.5830,1
0.7403,0.7355,0.9038,0.6087,0.8133,0.2885,0.6824,1
0.5099,0.5124,0.8920,0.2613,0.6785,0.3343,0.3077,1
0.7705,0.7789,0.8330,0.6824,0.8831,0.4451,0.7253,1
0.7611,0.8264,0.5599,0.7804,0.6871,0.4715,0.7794,1
0.6978,0.7107,0.8276,0.6081,0.7534,0.1940,0.6893,1
0.9037,0.9545,0.5935,0.9088,0.8147,0.1489,0.8203,1
0.6572,0.6715,0.8258,0.5023,0.7555,0.5982,0.5623,1
0.2342,0.3120,0.3621,0.3226,0.2594,0.5902,0.4313,2
0.2578,0.3161,0.4828,0.3615,0.3158,0.8152,0.4535,2
0.2597,0.3182,0.4891,0.2759,0.3165,0.6800,0.3880,2
0.1539,0.1880,0.5181,0.1830,0.2402,0.6116,0.3456,2
0.1161,0.2045,0.1751,0.2337,0.1048,0.4819,0.3245,2
0.0585,0.1488,0.0780,0.2140,0.0406,0.7026,0.3722,2
0.0793,0.1488,0.2305,0.1560,0.0634,0.1893,0.3018,2
0.1794,0.2169,0.5236,0.2072,0.2402,0.4754,0.2378,2
0.1992,0.2686,0.3721,0.2742,0.2003,0.3244,0.3924,2
0.0189,0.1074,0.0236,0.2354,0.0128,0.6107,0.3323,2
0.1171,0.1694,0.3766,0.2050,0.1497,0.5760,0.3880,2
0.1341,0.2293,0.1525,0.2849,0.1041,0.8096,0.3698,2
0.1577,0.2459,0.2287,0.2866,0.1447,0.5189,0.4141,2
0.0557,0.1302,0.1679,0.1807,0.0449,0.3338,0.2373,2
0.0727,0.1322,0.2731,0.1554,0.0891,0.4269,0.3663,2
0.0567,0.1322,0.1561,0.1976,0.0321,0.6563,0.3447,2
0.0708,0.0950,0.4673,0.0867,0.1561,0.3357,0.2383,2
0.1454,0.2727,0.0000,0.2787,0.0820,0.5279,0.3452,2
0.1095,0.2293,0.0009,0.3069,0.0342,0.4698,0.3895,2
0.0850,0.1674,0.1652,0.2280,0.0463,0.6011,0.3895,2
0.1841,0.2603,0.3122,0.3108,0.1775,0.3013,0.4786,2
0.1350,0.1901,0.3829,0.2539,0.1283,0.4559,0.3885,2
0.1379,0.2066,0.3040,0.2072,0.1547,0.5491,0.2595,2
0.1851,0.2397,0.4328,0.2444,0.2409,0.4751,0.3235,2
0.0519,0.0785,0.4328,0.0631,0.1169,0.7311,0.2610,2
0.1426,0.1529,0.6461,0.1160,0.2217,0.1867,0.2644,2
0.1747,0.2438,0.3457,0.2365,0.1903,0.5408,0.3698,2
0.1473,0.2149,0.3285,0.2917,0.1475,0.3735,0.4032,2
0.0718,0.1467,0.1906,0.1560,0.0271,0.4644,0.3018,2
0.0614,0.1219,0.2523,0.1075,0.0606,0.3583,0.2802,2
0.0406,0.1219,0.0980,0.2399,0.0506,0.7762,0.3171,2
0.0907,0.1426,0.3394,0.1509,0.1532,0.7736,0.2152,2
0.0642,0.1157,0.3067,0.1064,0.0948,0.4608,0.2368,2
0.0765,0.1384,0.2668,0.1334,0.0948,0.6271,0.2806,2
0.0227,0.1136,0.0163,0.2134,0.0078,0.5743,0.3279,2
0.0198,0.0331,0.4619,0.0462,0.1361,0.5211,0.2678,2
0.0633,0.1240,0.2486,0.1616,0.0570,0.5942,0.2821,2
0.0142,0.0661,0.2250,0.1385,0.0086,0.5119,0.2186,2
0.0840,0.1322,0.3557,0.1582,0.0912,0.6645,0.2378,2
0.1530,0.2190,0.3376,0.2579,0.1875,0.1165,0.3245,2
0.0774,0.1116,0.4347,0.1075,0.1033,0.5450,0.1507,2
0.1766,0.2066,0.5672,0.1898,0.2758,0.5489,0.3092,2
0.1511,0.1963,0.4519,0.1920,0.1989,0.5320,0.3146,2
0.1001,0.1364,0.4483,0.1177,0.1568,0.5778,0.3033,2
0.2172,0.2810,0.4174,0.3356,0.2823,0.7047,0.3924,2
0.0916,0.1860,0.1062,0.2613,0.0378,0.4287,0.3264,2
0.1152,0.2149,0.1062,0.2894,0.0613,0.5374,0.4101,2
0.0302,0.0806,0.2641,0.1064,0.0321,0.4439,0.2152,2
0.0604,0.0847,0.4655,0.1070,0.1361,0.8788,0.2157,2
0.0000,0.0000,0.5145,0.0000,0.1119,0.5474,0.1354,2
0.0321,0.0806,0.2804,0.0828,0.0620,0.6024,0.2590,2
0.0642,0.0930,0.4374,0.1081,0.1240,0.4187,0.2373,2
0.1209,0.1260,0.6479,0.1312,0.2302,0.3682,0.3018,2
0.0217,0.0868,0.1588,0.1582,0.0000,0.5315,0.2806,2
0.1435,0.1777,0.5064,0.1898,0.2459,0.4378,0.2427,2
0.2087,0.2190,0.7069,0.1470,0.3535,0.5341,0.1945,2
0.2077,0.2314,0.6397,0.1830,0.3022,0.6134,0.2161,2
0.2625,0.2831,0.6969,0.2370,0.3550,0.5077,0.2816,2
0.1917,0.2603,0.3630,0.2877,0.2003,0.3304,0.3506,2
0.2049,0.2004,0.8013,0.0980,0.3742,0.2682,0.1531,2
Test data:
# wheat_test.txt
#
0.0784,0.0930,0.5463,0.0614,0.1568,0.2516,0.0433,0
0.0604,0.0455,0.6887,0.0017,0.1775,0.1955,0.0906,0
0.1671,0.1612,0.7641,0.0997,0.2937,0.3192,0.0423,0
0.2483,0.2955,0.5436,0.2793,0.3136,0.4410,0.2802,0
0.2068,0.2397,0.5762,0.2044,0.2823,0.0534,0.1295,0
0.2162,0.2252,0.7241,0.1351,0.3485,0.2063,0.0433,0
0.3541,0.4050,0.5853,0.4116,0.3991,0.0712,0.3107,0
0.3229,0.3884,0.4936,0.3998,0.3763,0.1888,0.3018,0
0.3569,0.4091,0.5853,0.3773,0.3728,0.0909,0.3845,0
0.2021,0.2769,0.3421,0.2889,0.1796,0.3599,0.2698,0
0.7280,0.7190,0.9319,0.6081,0.8019,0.2694,0.7105,1
0.7885,0.8079,0.7813,0.7010,0.8517,0.2786,0.7041,1
0.4523,0.5145,0.5672,0.5546,0.4547,0.4807,0.6283,1
0.5260,0.6033,0.5109,0.5327,0.5453,0.4552,0.6283,1
0.4693,0.5124,0.6733,0.4938,0.5545,0.5470,0.6539,1
0.4523,0.4649,0.8249,0.3255,0.5952,0.3686,0.4530,1
0.6393,0.6921,0.6388,0.7016,0.6728,0.3590,0.7149,1
0.4703,0.5661,0.4047,0.5749,0.4284,0.2438,0.6696,1
0.4731,0.5579,0.4528,0.5253,0.4676,0.2548,0.6071,1
0.5326,0.5723,0.6978,0.5479,0.6001,0.3906,0.6908,1
0.1690,0.2128,0.4791,0.1802,0.2559,0.6120,0.2590,2
0.1964,0.1880,0.8131,0.0479,0.3599,0.1996,0.1113,2
0.0557,0.0640,0.5436,0.0619,0.1283,0.4272,0.1521,2
0.1992,0.2066,0.7196,0.1599,0.3286,1.0000,0.2368,2
0.1681,0.2190,0.4410,0.1717,0.2352,0.4101,0.2373,2
0.1511,0.1632,0.6370,0.1340,0.2502,0.3726,0.1728,2
0.0604,0.0971,0.3902,0.1357,0.1176,0.4629,0.2383,2
0.2465,0.2583,0.7278,0.1898,0.4291,0.9817,0.2644,2
0.1180,0.1653,0.3993,0.1554,0.1468,0.3683,0.2585,2
0.1615,0.1921,0.5472,0.1937,0.2452,0.6335,0.2678,2