Regression Using PyTorch 1.12.1-CPU on MacOS

I use Windows OS machines for most of my work, but I also use MacOS and Linux machines. I try to keep in practice with all three platforms, so one morning I figured I’d run the latest version of one of my standard PyTorch demo programs on my MacBook laptop.

Windows          Mac
Notepad          TextEdit
Ctrl-c           Command-c
PrtScn key       Shift-Command-3
File Explorer    Finder
Chrome           Safari
cmd              Terminal (bash)
  dir              ls
  md               mkdir
  cls              clear

I logged in to my MacBook, opened a Terminal (bash) shell, and checked to make sure my existing Python 3.7.6 and PyTorch 1.12.1-CPU installations were working correctly. They were.
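Checking the installed versions takes only a couple of lines of Python (a minimal sketch; the printed version strings will vary from machine to machine):

```python
import sys
import torch

# quick sanity check of the Python and PyTorch versions
print(sys.version.split()[0])   # e.g. 3.7.6
print(torch.__version__)        # e.g. 1.12.1
```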

My demo was a regression problem. The goal of a regression problem is to predict a single numeric value. For my demo, the goal is to predict a person’s income, from sex (male = -1, female = +1), age, state (Michigan, Nebraska, Oklahoma), and political leaning (conservative, moderate, liberal). See the data and program for the Windows version at

I copied the training and test data from the page-link above and saved them as people_train.txt and people_test.txt. The data looks like:

# sex age  state  income  politics
# -----------------------------
  1, 0.24, 1,0,0, 0.2950, 0,0,1
 -1, 0.39, 0,0,1, 0.5120, 0,1,0
  1, 0.63, 0,1,0, 0.7580, 1,0,0
 -1, 0.36, 1,0,0, 0.4450, 0,1,0
. . .

The network definition looks like:

class Net(T.nn.Module):
  def __init__(self):
    super(Net, self).__init__()
    self.hid1 = T.nn.Linear(8, 10)  # 8-(10-10)-1
    self.hid2 = T.nn.Linear(10, 10)
    self.oupt = T.nn.Linear(10, 1)


  def forward(self, x):
    z = T.tanh(self.hid1(x))
    z = T.tanh(self.hid2(z))
    z = self.oupt(z)  # regression: no activation
    return z
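A quick way to verify the architecture is to feed a dummy batch through the network and check the output shape. This sketch just instantiates the Net class above with made-up input (the dummy batch size of 4 is arbitrary):

```python
import torch as T

class Net(T.nn.Module):
  def __init__(self):
    super(Net, self).__init__()
    self.hid1 = T.nn.Linear(8, 10)  # 8-(10-10)-1
    self.hid2 = T.nn.Linear(10, 10)
    self.oupt = T.nn.Linear(10, 1)

  def forward(self, x):
    z = T.tanh(self.hid1(x))
    z = T.tanh(self.hid2(z))
    z = self.oupt(z)  # regression: no activation
    return z

net = Net()
dummy = T.randn(4, 8)   # a batch of 4 items, 8 predictors each
out = net(dummy)
print(out.shape)        # torch.Size([4, 1])
```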

I implemented program-defined train() and accuracy() functions. The train() function is:

def train(model, ds, bs, lr, me, le):
  # dataset, bat_size, lrn_rate, max_epochs, log interval
  train_ldr = T.utils.data.DataLoader(ds, batch_size=bs, shuffle=True)
  loss_func = T.nn.MSELoss()
  optimizer = T.optim.Adam(model.parameters(), lr=lr)

  for epoch in range(0, me):
    epoch_loss = 0.0  # for one full epoch
    for (b_idx, batch) in enumerate(train_ldr):
      X = batch[0]  # predictors
      y = batch[1]  # target income
      optimizer.zero_grad()  # reset accumulated gradients
      oupt = model(X)
      loss_val = loss_func(oupt, y)  # a tensor
      epoch_loss += loss_val.item()  # accumulate
      loss_val.backward()  # compute gradients
      optimizer.step()     # update weights
    if epoch % le == 0:
      print("epoch = %4d  |  loss = %0.4f" % (epoch, epoch_loss))
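The accuracy() function isn't shown above. For regression problems, a common approach is to count a prediction as correct if it's within a specified percentage of the true target value. A sketch along those lines (my actual implementation may differ in details):

```python
import torch as T

def accuracy(model, ds, pct_close):
  # an item is scored correct if the predicted income is within
  # pct_close (e.g. 0.10 = 10 percent) of the true income
  n_correct = 0; n_wrong = 0
  for i in range(len(ds)):
    X = ds[i][0]  # predictors
    y = ds[i][1]  # target income
    with T.no_grad():
      oupt = model(X)
    if T.abs(oupt - y) < T.abs(pct_close * y):
      n_correct += 1
    else:
      n_wrong += 1
  return (n_correct * 1.0) / (n_correct + n_wrong)
```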

After saving the data, I slightly edited the Windows version of the Torch program by changing the Windows “\\” file path separators to the Linux-based “/” separators. I also edited the PeopleDataset class to use comma characters as the delimiter instead of the tab characters that I used for the Windows version.
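The PeopleDataset class isn't shown above, but a minimal comma-delimited sketch looks something like this (it assumes the nine-column layout shown in the data sample: sex, age, one-hot state, income, one-hot politics, with income at column index 5 as the target):

```python
import numpy as np
import torch as T

class PeopleDataset(T.utils.data.Dataset):
  def __init__(self, src_file):
    # comma delimiter; lines starting with '#' are comments
    all_xy = np.loadtxt(src_file, usecols=range(0, 9),
      delimiter=",", comments="#", dtype=np.float32)
    tmp_x = all_xy[:, [0, 1, 2, 3, 4, 6, 7, 8]]  # sex, age, state, politics
    tmp_y = all_xy[:, 5].reshape(-1, 1)          # target income
    self.x_data = T.tensor(tmp_x, dtype=T.float32)
    self.y_data = T.tensor(tmp_y, dtype=T.float32)

  def __len__(self):
    return len(self.x_data)

  def __getitem__(self, idx):
    return (self.x_data[idx], self.y_data[idx])
```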

After correcting a couple of the usual minor typos, the regression program ran on my MacBook.

According to the not-so-always-accurate Internet, three jobs that have surprisingly high incomes are: 1.) crime scene cleanup, 2.) exotic dancer, 3.) movie cat wrangler. For me these are: 1.) no thanks, 2.) wrong gender, 3.) not qualified.

This entry was posted in PyTorch. Bookmark the permalink.
