The R language has hundreds, and probably thousands, of functions that can perform classical statistics tests and analyses. R also has functions that can perform machine learning tasks.
The nnet (neural network) package used to be a separate add-on, but it now ships as one of the recommended packages included with a standard R installation. I decided to give nnet a quick look.
Interestingly, I found very few online resources for the nnet package, which suggests to me that the package isn't used very much. Based on my brief experience, I'd say the strength of the nnet package is that it's tightly coupled to R language features such as data frames. The weakness of nnet, in my opinion, is that it doesn't work like most common neural network libraries. For example, nnet fits its weights using BFGS optimization (via R's optim function) rather than back-propagation.
I wrote a demo that followed the two examples in the nnet documentation. First, I created a custom data frame named irisdf using the built-in "iris" data frame that comes with R (in the datasets package):
col1 = iris$Sepal.Length
col2 = iris$Sepal.Width
col3 = iris$Petal.Length
col4 = iris$Petal.Width
col5 = factor(c(rep("s",50), rep("c",50), rep("v",50)))
irisdf = data.frame(col1, col2, col3, col4, col5)
names(irisdf) = c("SepLen", "SepWid", "PetLen", "PetWid", "Species")
Basically, I tear apart the built-in iris data frame and then recreate it so that the Species column is a factor with levels s, c, v instead of the text labels "setosa", "versicolor", "virginica".
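One wrinkle worth noting: factor() sorts its levels alphabetically by default, so the levels come out as c, s, v even though the values were supplied in s-c-v order. A quick sanity check, using only base R and the built-in iris data:

```r
# Sketch: verify the rebuilt data frame (base R only)
col5 = factor(c(rep("s",50), rep("c",50), rep("v",50)))
irisdf = data.frame(iris[, 1:4], col5)
names(irisdf) = c("SepLen", "SepWid", "PetLen", "PetWid", "Species")
levels(irisdf$Species)   # "c" "s" "v" -- factor levels sort alphabetically
table(irisdf$Species)    # 50 rows of each species
```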
Then I created and trained a neural network with 2 hidden nodes, using 75 of the 150 data items:
library(nnet)
samp = c(sample(1:50,25), sample(51:100,25), sample(101:150,25))
mynn = nnet(Species ~ SepLen + SepWid + PetLen + PetWid, data=irisdf,
  subset=samp, size=2, decay=5e-4, maxit=100)
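Printing the fitted object is a quick way to confirm the architecture. Here's a self-contained sketch of what I'd check; the set.seed call is my addition for reproducibility and wasn't in the original demo:

```r
library(nnet)
set.seed(1)  # my addition, so the sample() draws are reproducible
irisdf = data.frame(iris[, 1:4],
                    factor(c(rep("s",50), rep("c",50), rep("v",50))))
names(irisdf) = c("SepLen", "SepWid", "PetLen", "PetWid", "Species")
samp = c(sample(1:50,25), sample(51:100,25), sample(101:150,25))
mynn = nnet(Species ~ SepLen + SepWid + PetLen + PetWid, data=irisdf,
            subset=samp, size=2, decay=5e-4, maxit=100)
print(mynn)        # reports a 4-2-3 network with 19 weights
length(mynn$wts)   # 19 = (4+1)*2 input-to-hidden plus (2+1)*3 hidden-to-output
```

The three output nodes appear because nnet automatically expands a factor response with more than two levels into one output unit per class.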
Then I used the table() and predict() functions to generate a confusion matrix of predictions for the data that wasn’t used for training:
table(irisdf$Species[-samp], predict(mynn, irisdf[-samp,], type = "class"))
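The confusion matrix shows the counts, but a single accuracy number is often handier. Here's a self-contained sketch of computing it by comparing predicted and true class labels directly (which avoids any trouble if the tabulated matrix isn't square); again, the set.seed call is my addition:

```r
library(nnet)
set.seed(1)  # for reproducibility; not in the original demo
irisdf = data.frame(iris[, 1:4],
                    factor(c(rep("s",50), rep("c",50), rep("v",50))))
names(irisdf) = c("SepLen", "SepWid", "PetLen", "PetWid", "Species")
samp = c(sample(1:50,25), sample(51:100,25), sample(101:150,25))
mynn = nnet(Species ~ SepLen + SepWid + PetLen + PetWid, data=irisdf,
            subset=samp, size=2, decay=5e-4, maxit=100)
preds = predict(mynn, irisdf[-samp,], type="class")
acc = mean(preds == as.character(irisdf$Species[-samp]))
acc   # proportion of the 75 held-out rows classified correctly
```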
All in all, it was an interesting exploration. The bottom line for me is that there’s always a trade-off between trying to learn a complex library written by someone else and just writing your own library from scratch.