Author Archives: jamesdmccaffrey

A Quick Demo of the DBSCAN Clustering Algorithm

I was reading a research paper this morning that used the DBSCAN (“density-based spatial clustering of applications with noise”) clustering algorithm. DBSCAN is somewhat similar to k-means clustering. Both work only with strictly numeric data. In k-means you …
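To make the idea concrete, here is a minimal pure-Python sketch of the classic DBSCAN procedure. The two parameters are the standard ones from the algorithm (eps = neighborhood radius, min_pts = points needed to call a region dense); the demo data values are made up for illustration.

```python
# Minimal DBSCAN sketch in pure Python.
# eps = neighborhood radius; min_pts = points needed for a dense region.
import math

def region_query(data, p, eps):
    # indices of all points within eps of data[p] (Euclidean distance)
    return [i for i, q in enumerate(data) if math.dist(data[p], q) <= eps]

def dbscan(data, eps, min_pts):
    NOISE = -1
    labels = [None] * len(data)   # None = not yet visited
    cluster = 0
    for p in range(len(data)):
        if labels[p] is not None:
            continue
        neighbors = region_query(data, p, eps)
        if len(neighbors) < min_pts:
            labels[p] = NOISE     # may later be claimed as a border point
            continue
        labels[p] = cluster       # p is a core point; grow a new cluster
        seeds = list(neighbors)
        while seeds:
            q = seeds.pop()
            if labels[q] == NOISE:
                labels[q] = cluster        # border point: joins, not expanded
            if labels[q] is not None:
                continue
            labels[q] = cluster
            if len(region_query(data, q, eps)) >= min_pts:
                seeds.extend(region_query(data, q, eps))  # q is also core
        cluster += 1
    return labels

data = [(1.0, 1.0), (1.1, 1.0), (0.9, 1.1),   # dense group A
        (5.0, 5.0), (5.1, 4.9), (4.9, 5.1),   # dense group B
        (9.0, 0.0)]                           # isolated point
print(dbscan(data, eps=0.5, min_pts=3))  # → [0, 0, 0, 1, 1, 1, -1]
```

Unlike k-means, no cluster count is specified up front, and the isolated point is labeled -1 (noise) rather than being forced into a cluster.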

Posted in Machine Learning

Differential Evolution Optimization in Visual Studio Magazine

I wrote an article titled “Differential Evolution Optimization” in the September 2021 issue of Microsoft Visual Studio Magazine. See https://visualstudiomagazine.com/articles/2021/09/07/differential-evolution-optimization.aspx. The most common type of optimization for neural network training is some form of stochastic gradient descent (SGD). SGD …
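As background, here is a sketch of the classic DE/rand/1/bin form of differential evolution in pure Python, minimizing the simple sphere function. The parameter names (F, CR, pop_size) are the usual DE conventions, not necessarily the article's; the bounds and population size are illustration values.

```python
# Differential evolution (DE/rand/1/bin) sketch, minimizing sum(x_i^2).
import random

def sphere(x):
    return sum(v * v for v in x)

def diff_evo(f, dim, bounds, pop_size=20, F=0.5, CR=0.9,
             max_gen=200, seed=0):
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)]
           for _ in range(pop_size)]
    costs = [f(ind) for ind in pop]
    for _ in range(max_gen):
        for i in range(pop_size):
            # pick three distinct donor individuals, none equal to i
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # guarantees one mutated component
            trial = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                     if (rng.random() < CR or k == j_rand) else pop[i][k]
                     for k in range(dim)]
            tc = f(trial)
            if tc <= costs[i]:           # greedy one-to-one selection
                pop[i], costs[i] = trial, tc
    best = min(range(pop_size), key=lambda i: costs[i])
    return pop[best], costs[best]

best_x, best_cost = diff_evo(sphere, dim=5, bounds=(-5.0, 5.0))
print(best_cost)  # typically very close to 0.0
```

Note there is no gradient anywhere: candidate solutions improve purely by mutation (scaled difference vectors), crossover, and greedy selection, which is what makes DE usable on non-differentiable objectives.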

Posted in Machine Learning

Yet Another MNIST Example Using Keras

It’s a major challenge to keep up with the continuous changes to the Keras/TensorFlow neural code library (and the PyTorch library too). I recently upgraded my Keras installation to version 2.6 and so I’m going through all my standard examples …

Posted in Keras

NFL 2021 Week 2 Predictions – Zoltar Likes Eight Vegas Underdogs

Zoltar is my NFL football prediction computer program. It uses reinforcement learning and a neural network. Here are Zoltar’s predictions for week #2 of the 2021 season. These predictions are tentative, in the sense that it usually takes Zoltar about …

Posted in Zoltar

Determining If Two Sentences Are Paraphrases Of Each Other Using Hugging Face

Deep neural systems based on Transformer Architecture (TA) have revolutionized the field of natural language processing (NLP). Unfortunately, TA systems are insanely complex, meaning that implementing a TA system from scratch is not feasible, and implementing TA using a low-level …

Posted in PyTorch

A Simplified Approach for Ordinal Classification

In a standard classification problem, the goal is to predict a class label. For example, in the Iris Dataset problem, the goal is to predict a species of flower: 0 = “setosa”, 1 = “versicolor”, 2 = “virginica”. Here the …
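The post's approach is truncated above, so as background only, here is one widely used ordinal trick (not necessarily the article's method): encode ordinal class k of K as a cumulative binary target, so the ordering of the labels is built into the training data. The helper names below are hypothetical.

```python
# Cumulative binary encoding for ordinal targets (a common technique,
# not necessarily the article's). Class k of K becomes K-1 binary
# targets: the first k are 1, the rest are 0.

def ordinal_encode(k, num_classes):
    # e.g. for K = 3: class 0 -> [0, 0], class 1 -> [1, 0], class 2 -> [1, 1]
    return [1 if i < k else 0 for i in range(num_classes - 1)]

def ordinal_decode(probs, threshold=0.5):
    # predicted class = number of binary thresholds passed
    return sum(1 for p in probs if p > threshold)

print(ordinal_encode(0, 3))        # [0, 0]  e.g. "setosa"
print(ordinal_encode(2, 3))        # [1, 1]  e.g. "virginica"
print(ordinal_decode([0.9, 0.2]))  # 1       e.g. "versicolor"
```

A network trained against these targets with a per-output binary loss is penalized more for predictions that are far from the true class in the ordering, which plain one-hot cross-entropy ignores.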

Posted in PyTorch

Example of Computing Kullback-Leibler Divergence for Continuous Distributions

In this post, I present an example of estimating the Kullback-Leibler (KL) divergence between two continuous distributions using the Monte Carlo technique. Whoa! Just stating the problem has a massive amount of information. The KL divergence is the key part …
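Since the excerpt is truncated, here is a generic sketch of the idea for two 1-D Gaussians: KL(P||Q) = E over x~P of [log p(x) − log q(x)], so drawing samples from P and averaging the log-density difference gives a Monte Carlo estimate, which can be checked against the known closed form for Gaussians. The specific means and standard deviations are made-up illustration values.

```python
# Monte Carlo estimate of KL(P || Q) for two 1-D Gaussians,
# checked against the closed-form Gaussian KL divergence.
import math, random

def log_pdf(x, mu, sigma):
    # log of the normal density N(mu, sigma^2) at x
    return -0.5 * ((x - mu) / sigma) ** 2 \
           - math.log(sigma * math.sqrt(2.0 * math.pi))

mu_p, sd_p = 0.0, 1.0   # distribution P
mu_q, sd_q = 1.0, 1.5   # distribution Q

# KL(P||Q) = E_{x~P}[ log p(x) - log q(x) ]: sample from P and average
rng = random.Random(1)
n = 200_000
total = 0.0
for _ in range(n):
    x = rng.gauss(mu_p, sd_p)
    total += log_pdf(x, mu_p, sd_p) - log_pdf(x, mu_q, sd_q)
kl_mc = total / n

# closed form for two Gaussians, for comparison:
# ln(s2/s1) + (s1^2 + (m1-m2)^2) / (2 s2^2) - 1/2
kl_exact = math.log(sd_q / sd_p) \
    + (sd_p ** 2 + (mu_p - mu_q) ** 2) / (2.0 * sd_q ** 2) - 0.5

print(kl_mc, kl_exact)  # the two values should agree closely
```

The Monte Carlo version is what you reach for when the distributions have no convenient closed-form KL, which is the usual situation in practice.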

Posted in Machine Learning, PyTorch

The Wasserstein Distance Using C#

The Wasserstein distance has many different variations. In its simplest form the Wasserstein distance function measures the distance between two discrete probability distributions. For example, if: double[] P = new double[] { 0.6, 0.1, 0.1, 0.1, 0.1 }; double[] Q1 …
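The excerpt's Q1 distribution is truncated, so the sketch below uses P from the excerpt and a made-up comparison distribution Q. In this simplest 1-D discrete case, on evenly spaced bins, the Wasserstein-1 distance reduces to the sum of absolute differences between the two running CDFs (a Python rendering of the idea; the article itself uses C#).

```python
# Wasserstein-1 distance between two discrete distributions on the
# same unit-spaced support: sum of |running CDF difference|.

def wasserstein_1d(p, q):
    # assumes p and q are same-length histograms on unit-spaced bins
    dist = 0.0
    cum = 0.0   # running difference of the two CDFs
    for pi, qi in zip(p, q):
        cum += pi - qi
        dist += abs(cum)
    return dist

P = [0.6, 0.1, 0.1, 0.1, 0.1]   # from the excerpt
Q = [0.1, 0.1, 0.1, 0.1, 0.6]   # made-up comparison distribution
print(wasserstein_1d(P, Q))     # ~2.0: mass 0.5 moved four bins
```

The result has an earth-mover reading: 0.5 of P's probability mass must travel four bins to turn P into Q, giving 0.5 × 4 = 2.0, whereas symmetric measures like KL divergence ignore how far apart the bins are.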

Posted in Machine Learning

NFL 2021 Week 1 Predictions – Zoltar Likes Six Underdogs

Zoltar is my NFL football prediction computer program. It uses reinforcement learning and a neural network. Here are Zoltar’s predictions for week #1 of the 2021 season. These predictions are tentative, in the sense that it usually takes Zoltar about …

Posted in Zoltar

A Recap of the 2021 National Homeland Security Conference

I gave a short talk at the 2021 National Homeland Security Conference. See https://www.nationalhomelandsecurity.org/. The event was held from August 30 through September 2. The event Web site states, “The National Homeland Security Conference brings together professionals in Homeland Security, …

Posted in Conferences