Implementing a Neural Network Using Raw JavaScript

Quite some time ago I implemented a neural network multi-class classifier using raw JavaScript. The network had just a single hidden layer of nodes, but even so, the implementation took many days to complete. See https://jamesmccaffrey.wordpress.com/2022/01/24/a-neural-network-with-raw-javascript/.

My old implementation used one-hot encoding for the target variable, softmax output node activation, and mean squared error based optimization. But recently, on a cold and dreary Pacific Northwest weekend, in a moment of temporary insanity, I decided to update my old JavaScript code to mirror common PyTorch techniques: ordinal encoding for the target class labels, log-softmax output node activation, and negative log likelihood (aka cross entropy) based optimization. Many hours later, I got a revised version of the JavaScript neural network running.
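
To give the flavor of the revised approach, here is a minimal sketch of log-softmax activation and negative log likelihood loss in raw JavaScript. The function names are illustrative rather than the names used in my actual program:

function logSoftmax(vec)
{
  // log-softmax: each component minus log-sum-exp of the vector;
  // subtracting the max value first avoids numeric overflow
  let mx = Math.max(...vec);
  let sum = 0.0;
  for (let i = 0; i < vec.length; ++i)
    sum += Math.exp(vec[i] - mx);
  let logSumExp = mx + Math.log(sum);
  let result = [];
  for (let i = 0; i < vec.length; ++i)
    result[i] = vec[i] - logSumExp;
  return result;
}

function nllLoss(logProbs, target)
{
  // negative log likelihood for one data item, where target
  // is an ordinal class label such as 0, 1, or 2
  return -logProbs[target];
}

For example, logSoftmax([2.0, 1.0, 0.5]) returns the log-probabilities for the three classes, and calling nllLoss() on that result with target = 2 gives the loss for an item whose true class is liberal.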

I used one of my standard multi-class datasets where the goal is to predict a person’s political leaning (conservative = 0, moderate = 1, liberal = 2) from sex (M = -1, F = +1), age (divided by 100), state (Michigan = 1 0 0, Nebraska = 0 1 0, Oklahoma = 0 0 1), and income (divided by $100,000). The data looks like:

 1  0.24  1  0  0  0.2950  2
-1  0.39  0  0  1  0.5120  1
 1  0.63  0  1  0  0.7580  0
-1  0.36  1  0  0  0.4450  1
. . . 
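
The data files are already normalized and encoded. Here is a hypothetical encodePerson() helper that shows the mapping from one raw data item to its encoded form (the "con", "mod", "lib" label strings are my assumption):

function encodePerson(sex, age, state, income, politics)
{
  // map one raw item such as ("M", 32, "michigan", 52000.00, "lib")
  // to its normalized and encoded form
  let result = [];
  result.push(sex == "M" ? -1 : 1);  // sex: M = -1, F = +1
  result.push(age / 100.0);  // age divided by 100
  let states = ["michigan", "nebraska", "oklahoma"];
  for (let i = 0; i < states.length; ++i)  // state: one-hot
    result.push(state == states[i] ? 1 : 0);
  result.push(income / 100000.0);  // income divided by $100,000
  let labels = ["con", "mod", "lib"];
  result.push(labels.indexOf(politics));  // politics: ordinal 0, 1, 2
  return result;
}

// encodePerson("M", 32, "michigan", 52000.00, "lib")
// returns [-1, 0.32, 1, 0, 0, 0.52, 2]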

The JavaScript neural network program (along with the supporting Utility library of functions) is hundreds of lines of code so I won’t present it all in this blog post. The program main() function starts with:

let U = require("../Utilities/utilities_lib.js");
let FS = require("fs");

function main()
{
  process.stdout.write("\033[0m");  // reset
  process.stdout.write("\x1b[1m" + "\x1b[37m");  // white
  console.log("\nBegin People Data demo with JavaScript ");

  // 1. load data
  // raw data looks like:   M  32   michigan  52,000.00  lib
  // norm data looks like: -1  0.32  1 0 0     0.5200     2
  let trainX = U.loadTxt(".\\Data\\people_train.txt",
    "\t", [0,1,2,3,4,5]);
  let trainY = U.loadTxt(".\\Data\\people_train.txt",
    "\t", [6]);
  let testX = U.loadTxt(".\\Data\\people_test.txt",
    "\t", [0,1,2,3,4,5]);
  let testY = U.loadTxt(".\\Data\\people_test.txt",
    "\t", [6]);

  // 2. create network
  console.log("\nCreating 6-25-3 tanh, log-softmax NN ");
  let seed = 0;
  let nn = new NeuralNet(6, 25, 3, seed);

  // 3. train network
  let lrnRate = 0.01;
  let maxEpochs = 5000;
  console.log("Starting training lrn rate = 0.01 ");
  nn.train(trainX, trainY, lrnRate, maxEpochs);
  console.log("Training complete");

  . . .
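
The Utility library isn't shown in this post, but based on how it's called, here's a minimal sketch of what a loadTxt() style helper might look like. This is my guess at its behavior (read a delimited text file and return the specified zero-based columns as a 2D array of numbers), not the actual library code:

let FS = require("fs");

function loadTxt(fn, delimit, usecols)
{
  // read a delimited text file; return the specified
  // zero-based columns as an array-of-arrays of numbers
  let all = FS.readFileSync(fn, "utf8");
  let lines = all.trim().split("\n");
  let result = [];
  for (let i = 0; i < lines.length; ++i) {
    let tokens = lines[i].trim().split(delimit);
    let row = [];
    for (let j = 0; j < usecols.length; ++j)
      row.push(parseFloat(tokens[usecols[j]]));
    result.push(row);
  }
  return result;
}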

Ultimately, my JavaScript exploration was essentially nothing more than a weekend mental exercise and a way to practice my JavaScript skills. It was a lot of work, but satisfying.



Most of the guys I work with love what they do and so we often write code on weekends — because we want to, not because we have to. I suspect that artists paint and draw even when they don’t have to, because they love their work. Here are examples of three of my favorite comic book artists of the 1960s. Left: A “The Atom” cover by Gil Kane (1926-2000). Center: A “Superman” cover by Curt Swan (1920-1996). Right: A “The Flash” cover by Carmine Infantino (1925-2013).

