A Quick Look at TorchSharp

I ran across an interesting GitHub project called TorchSharp. Briefly, TorchSharp is PyTorch without the Python. Let me try to explain.

PyTorch is (1) a Python-language wrapper over the libtorch C++ library, which implements low-level tensor functionality, and (2) Python-language code that uses the tensor functionality for higher-level features such as a Module class that creates a neural network.

TorchSharp is (1) a C#-language wrapper over the same libtorch C++ library, and (2) C#-language code that uses the tensor functionality for higher-level features such as a Module class that creates a neural network.

See https://github.com/dotnet/TorchSharp.

Here's some example code from the TorchSharp GitHub Examples directory.

using System;
using System.Diagnostics;
using static TorchSharp.torch;

using static TorchSharp.torch.nn;
using static TorchSharp.torch.nn.functional;
using static TorchSharp.torch.utils.data;

internal class Model : Module<Tensor, Tensor>
{
  private Module<Tensor, Tensor> conv1 = Conv2d(1, 32, 3);
  private Module<Tensor, Tensor> conv2 = Conv2d(32, 64, 3);
  private Module<Tensor, Tensor> fc1 = Linear(9216, 128);
  private Module<Tensor, Tensor> fc2 = Linear(128, 10);

  private Module<Tensor, Tensor> pool1 = MaxPool2d(kernelSize:
    new long[] { 2, 2 });

  private Module<Tensor, Tensor> relu1 = ReLU();
  private Module<Tensor, Tensor> relu2 = ReLU();
  private Module<Tensor, Tensor> relu3 = ReLU();

  private Module<Tensor, Tensor> dropout1 = Dropout(0.25);
  private Module<Tensor, Tensor> dropout2 = Dropout(0.5);

  private Module<Tensor, Tensor> flatten = Flatten();
  private Module<Tensor, Tensor> logsm = LogSoftmax(1);

  public Model(string name, torch.Device device = null) :
    base(name)
  {
    RegisterComponents();
    if (device != null && device.type == DeviceType.CUDA)
      this.to(device);
  }

  public override Tensor forward(Tensor input)
  {
    var l11 = conv1.forward(input);
    var l12 = relu1.forward(l11);
    var l21 = conv2.forward(l12);
    var l22 = relu2.forward(l21);
    var l23 = pool1.forward(l22);
    var l24 = dropout1.forward(l23);
    var x = flatten.forward(l24);
    var l31 = fc1.forward(x);
    var l32 = relu3.forward(l31);
    var l33 = dropout2.forward(l32);
    var l41 = fc2.forward(l33);
    return logsm.forward(l41);
  }
} // Model
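To put the class in context, here's a minimal usage sketch. It assumes the TorchSharp NuGet package is installed; the variable names and the "mnist-cnn" label are my own inventions. The layer sizes in the listing imply an MNIST-style 1 x 28 x 28 input: two 3x3 convolutions shrink 28 to 24, the 2x2 pool halves that to 12, and 64 channels x 12 x 12 = 9216, which matches the fc1 input size.

```
using static TorchSharp.torch;

// Sketch only -- assumes the TorchSharp NuGet package.
var model = new Model("mnist-cnn");  // no device arg: stays on CPU

// A batch of 64 fake 1 x 28 x 28 grayscale images.
var dummy = rand(new long[] { 64, 1, 28, 28 });

// forward() returns log-probabilities with shape 64 x 10;
// argmax along dim 1 would give the predicted class per image.
var logProbs = model.forward(dummy);
```

Because the final layer is LogSoftmax, a training loop would typically pair this output with a negative log-likelihood loss, mirroring the usual PyTorch pattern.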

Interesting stuff. I can see TorchSharp being useful in scenarios where neural network code has to interoperate with an existing C# business system.

The biggest risk for potential adopters is that the TorchSharp project will stop being actively developed. Google “history of Silverlight”.



Arguably, one of the main reasons for the success of PyTorch is the excellent helper modules. In the James Bond series of films, the main villain always has some helpers. Here are three of my favorites. All three looked neutral but could convey a sense of menace. Left: Kronsteen was a Russian chess Grandmaster and head of planning for the evil SMERSH organization in "From Russia with Love" (1963). Center: Dario was a sadistic killer and part of a drug cartel in "Licence to Kill" (1989). Right: Elvis was an all-purpose helper in "Quantum of Solace" (2008).


This entry was posted in Machine Learning.
