Category Archives: PyTorch

A Sentence Fill-in-The-Blank Example Using Hugging Face

Deep neural transformer architecture (TA) systems have revolutionized the field of natural language processing (NLP). Unfortunately, TA systems are incredibly complex and implementing such a system from scratch can take months. Enter the Hugging Face code library. Terrible name, excellent … Continue reading

Posted in Machine Learning, PyTorch

Computing the Similarity Between Two Machine Learning Datasets in Visual Studio Magazine

I wrote an article titled “Computing the Similarity Between Two Machine Learning Datasets” in the September 2021 edition of Microsoft Visual Studio Magazine. See https://visualstudiomagazine.com/articles/2021/09/20/dataset-similarity.aspx. A common task in many machine learning scenarios is the need to compute the similarity … Continue reading

Posted in Machine Learning, PyTorch

Finding Reliable Negatives For Positive and Unlabeled Learning (PUL) Datasets

Suppose you have a machine learning dataset for training, where only a few data items have a positive label (class = 1), but all the other data items are unlabeled and could be either negative (class = 0) or positive. … Continue reading
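The core idea of finding reliable negatives can be sketched in a few lines. The toy example below is an illustration of the general PUL idea, not necessarily the exact algorithm in the post: the data points and the centroid-distance scoring rule are invented for the sketch. Unlabeled items that look least like the known positives are treated as reliable negatives.

```python
import math

# toy 2-feature data: known positives cluster near (1, 1);
# unlabeled items may be positive (class 1) or negative (class 0)
positives = [(0.9, 1.1), (1.0, 0.9), (1.1, 1.0)]
unlabeled = [(1.0, 1.0), (0.1, 0.0), (0.0, 0.2), (0.9, 0.8)]

# centroid of the known positives
cx = sum(p[0] for p in positives) / len(positives)
cy = sum(p[1] for p in positives) / len(positives)

def dist_to_positives(pt):
    # distance from an item to the positive centroid
    return math.hypot(pt[0] - cx, pt[1] - cy)

# reliable negatives = the unlabeled items farthest from the positives
scored = sorted(unlabeled, key=dist_to_positives, reverse=True)
reliable_negatives = scored[:2]
```

In a realistic scenario the scoring function would be a trained classifier's predicted probability of class 1 rather than a raw distance, but the selection logic is the same.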

Posted in Machine Learning, PyTorch

Natural Language Question-Answering Using Hugging Face

I’m currently on a multi-week mission to explore the Hugging Face (HF) code library for Transformer Architecture (TA) systems for natural language processing (NLP) and today I did a question-answer (QA) example. Whew! That’s a lot of acronyms in an … Continue reading

Posted in PyTorch

Determining If Two Sentences Are Paraphrases Of Each Other Using Hugging Face

Deep neural systems based on Transformer Architecture (TA) have revolutionized the field of natural language processing (NLP). Unfortunately, TA systems are insanely complex, meaning that implementing a TA system from scratch is not feasible, and implementing TA using a low-level … Continue reading

Posted in PyTorch

A Simplified Approach for Ordinal Classification

In a standard classification problem, the goal is to predict a class label. For example, in the Iris Dataset problem, the goal is to predict a species of flower: 0 = “setosa”, 1 = “versicolor”, 2 = “virginica”. Here the … Continue reading
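One simplified way to handle ordinal labels (a sketch of the general idea, not necessarily the exact technique in the post) is to train a model that emits a single real value and then round and clip that value to the nearest valid ordered class:

```python
def to_ordinal(pred, num_classes):
    # round a real-valued prediction to the nearest class index,
    # then clip into the valid range [0, num_classes - 1]
    k = int(round(pred))
    return max(0, min(num_classes - 1, k))
```

With three ordered classes, a raw prediction of 1.7 maps to class 2 and an out-of-range prediction such as 5.2 is clipped back to class 2, so the ordering of labels is respected by construction.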

Posted in PyTorch

Example of Computing Kullback-Leibler Divergence for Continuous Distributions

In this post, I present an example of estimating the Kullback-Leibler (KL) divergence between two continuous distributions using the Monte Carlo technique. Whoa! Just stating the problem has a massive amount of information. The KL divergence is the key part … Continue reading
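The Monte Carlo estimate of KL(P || Q) draws samples x from P and averages log p(x) - log q(x). A minimal sketch, using two assumed Gaussian distributions so the estimate can be checked against the known closed form for Gaussians:

```python
import math
import random

random.seed(1)

def log_pdf(x, mu, sd):
    # log of the Gaussian probability density at x
    return -math.log(sd * math.sqrt(2 * math.pi)) - (x - mu) ** 2 / (2 * sd * sd)

# P = N(0, 1), Q = N(1, 2) -- example distributions for the sketch
n = 200_000
total = 0.0
for _ in range(n):
    x = random.gauss(0.0, 1.0)          # sample from P
    total += log_pdf(x, 0.0, 1.0) - log_pdf(x, 1.0, 2.0)
kl_est = total / n

# closed form for two Gaussians, for comparison:
# KL = log(s2/s1) + (s1^2 + (m1 - m2)^2) / (2 * s2^2) - 0.5
kl_true = math.log(2.0 / 1.0) + (1.0 + 1.0) / (2 * 4.0) - 0.5
```

With 200,000 samples the estimate lands very close to the exact value of about 0.443; for distributions with no closed-form KL, the sampling loop is all you have, which is the point of the technique.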

Posted in Machine Learning, PyTorch

Example of a PyTorch Custom Layer

When I create neural software systems, I most often use the PyTorch library. The Keras library is very good for basic neural systems but for advanced architectures I like the flexibility of PyTorch. Using raw TensorFlow without Keras is an … Continue reading
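A PyTorch custom layer is just a subclass of torch.nn.Module that registers its own Parameters and defines forward(). The hypothetical ScaledLinear layer below (invented for this sketch, not necessarily the layer from the post) computes a linear transform followed by tanh:

```python
import torch

class ScaledLinear(torch.nn.Module):
    def __init__(self, n_in, n_out):
        super().__init__()
        # Parameters are registered automatically and seen by optimizers
        self.weights = torch.nn.Parameter(torch.randn(n_in, n_out) * 0.1)
        self.bias = torch.nn.Parameter(torch.zeros(n_out))

    def forward(self, x):
        # linear transform followed by tanh squashing
        return torch.tanh(x @ self.weights + self.bias)

layer = ScaledLinear(4, 2)
out = layer(torch.ones(3, 4))  # batch of 3 items, 4 features each
```

Because the layer is an ordinary Module, it composes with torch.nn.Sequential and autograd with no extra work, which is exactly the flexibility mentioned above.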

Posted in PyTorch

An Example of a Bayesian Neural Network Using PyTorch

A regular neural network has a set of numeric constants called weights which determine the network output. If you feed the same input to a regular trained neural network, you will get the same output every time. In a Bayesian … Continue reading
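The key behavioral difference can be sketched without any library. In the toy neuron below (the means and standard deviations are made-up values for illustration), each weight is a distribution rather than a constant, and a fresh weight is sampled on every forward pass, so the same input produces different outputs:

```python
import random

random.seed(0)

class BayesianNeuron:
    def __init__(self, n_in):
        # each weight is a (mean, sd) pair, i.e. a distribution,
        # instead of a single fixed numeric constant
        self.w_mu = [0.5] * n_in   # assumed weight means
        self.w_sd = [0.2] * n_in   # assumed weight uncertainties

    def forward(self, x):
        # sample a concrete weight from each distribution on every call
        ws = [random.gauss(m, s) for m, s in zip(self.w_mu, self.w_sd)]
        return sum(w * xi for w, xi in zip(ws, x))

neuron = BayesianNeuron(3)
x = [1.0, 2.0, 3.0]
y1 = neuron.forward(x)
y2 = neuron.forward(x)  # same input, (almost surely) a different output
```

Averaging many forward passes gives a prediction, and their spread gives a built-in measure of the model's uncertainty, which a regular deterministic network cannot provide.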

Posted in PyTorch

Ordinal Classification for the Boston Housing Dataset Using PyTorch

Ordinal classification, also called ordinal regression, is a multi-class classification problem where the class labels to predict are ordered, for example, 0 = “poor”, 1 = “average”, 2 = “good”. You could just do normal classification, but then you don’t … Continue reading

Posted in PyTorch