Using Flot Graphing with a Simple ASP.NET Web Form Application

I was asked to create a Web application of some sort that could display graphs. I did some research on the Internet and quickly discovered there are many, many, many different approaches.

One approach that seemed promising was to use a very simple ASP.NET Web Form application (in C#), running on IIS, together with the Flot graphing library (which uses JavaScript). See the image at the bottom of this post.

My most important design guideline when creating anything is to keep it as simple as possible. I was more or less constrained to using Microsoft technologies; otherwise, instead of using ASP.NET for the base page, I might have used PHP running on IIS (or possibly Apache), plus the Flot library.

Anyway, first I set up IIS, which was not entirely trivial. I wrote an earlier blog post on that task. Next, I launched Visual Studio, running in Administrator mode (required when making a new Web site), which also wasn’t completely trivial because I was using the mildly developer-hostile Windows 8.1 operating system.

Next, I created a new C# Empty Web Site. I find ASP.NET quite irritating in some situations. When creating a Web-Something, you can go File | New | Project | Visual C# | ASP.NET Web Application. Or, you can go File | New | Web Site | Visual C# | ASP.NET Empty Web Site. The two approaches yield entirely different systems and the differences are complex. They’re explained at https://msdn.microsoft.com/en-us/library/dd547590.aspx. Creating an Empty Web Site is the simplest approach. I named mine TestEmptyWebSite. (Note: File | New | Web Site | Visual C# | ASP.NET Web Forms Site requires a version of SQL Server — just freaking crazy.)

[Screenshots: creating the new Empty Web Site in Visual Studio]

After the empty Web site was created, I right-clicked on the bolded Web site name in the Visual Studio Solution Explorer window, and selected Add | Add New Item | Web Form and accepted the default name of Default.aspx for the form.

[Screenshot: adding the Default.aspx Web Form]

In the Visual Studio editor window, I added a message and made sure my Web app / page could display.

[Screenshot: the test message displayed in the browser]

To add graphing capability, I downloaded the Flot graphing library. It’s all JavaScript. I created a sub-directory named Include in the TestEmptyWebSite and then copied all the Flot files into the Include directory. I could have put the Flot files in a root directory so they’d be available to other Web applications.

Finally, I added some simple JavaScript code to test the Flot graphing with hard-coded data in an array. To read data from a text file or SQL database, I’d add some ADO.NET C# code, then transfer the data into a JavaScript array.
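The Flot JavaScript itself isn't reproduced here (see the image below), but here's a hedged C# sketch of what that ADO.NET step might look like in the page's code-behind. This is my own example, not the demo's code: the connection string, the dbo.Points table (with numeric x and y columns), and the FlotData property are all hypothetical.

// Default.aspx.cs -- a hedged sketch, not the demo's code. It reads (x, y)
// pairs from a hypothetical dbo.Points table and builds a string such as
// "[[1,7],[2,9],[3,5]]", which is the kind of array the Flot plot call expects.
using System;
using System.Data.SqlClient;
using System.Text;

public partial class _Default : System.Web.UI.Page
{
    protected string FlotData = "[]";  // emitted into the page's JavaScript

    protected void Page_Load(object sender, EventArgs e)
    {
        var sb = new StringBuilder("[");
        string connStr = "Server=.;Database=TestDb;Integrated Security=true";  // hypothetical
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand("SELECT x, y FROM dbo.Points ORDER BY x", conn))
        {
            conn.Open();
            using (var rdr = cmd.ExecuteReader())
            {
                bool first = true;
                while (rdr.Read())
                {
                    if (!first) sb.Append(",");
                    sb.Append("[" + rdr.GetDouble(0) + "," + rdr.GetDouble(1) + "]");
                    first = false;
                }
            }
        }
        sb.Append("]");
        FlotData = sb.ToString();
    }
}

The markup side would then emit the string into the page's script, for example var data = <%= FlotData %>;, so the hard-coded test array could be swapped out for real data.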

[Screenshot: the Flot graph rendered by the demo page]

Posted in Miscellaneous

Coding Neural Network Back-Propagation using C#

I wrote an article titled “Coding Neural Network Back-Propagation using C#” in the April 2015 issue of Visual Studio Magazine. See http://visualstudiomagazine.com/articles/2015/04/01/back-propagation-using-c.aspx.

CodingBackPropagationArticle

As I explain in the article, you can think of a neural network as a complex mathematical function that accepts numeric inputs and generates numeric outputs. The values of the outputs are determined by the input values, the number of so-called hidden processing nodes, the hidden and output layer activation functions, and a set of weights and bias values.

A fully connected neural network with m inputs, h hidden nodes, and n outputs has (m * h) + h + (h * n) + n weights and biases. For example, a neural network with 4 inputs, 5 hidden nodes, and 3 outputs has (4 * 5) + 5 + (5 * 3) + 3 = 43 weights and biases. Training a neural network is the process of finding values for the weights and biases so that, for a set of training data with known input and output values, the computed outputs of the network closely match the known outputs.
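As a quick sanity check, that count is easy to compute in code (this little helper is mine, not from the article):

// Total number of weights and biases for a fully connected
// input-hidden-output neural network.
static int NumWeightsAndBiases(int numInput, int numHidden, int numOutput)
{
    return (numInput * numHidden) + numHidden +
           (numHidden * numOutput) + numOutput;
}
// Example: NumWeightsAndBiases(4, 5, 3) returns 43.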

By far the most common technique used to train a neural network is called the back-propagation algorithm. Back-propagation is quite complicated. The key to back-propagation is to compute what's called a gradient for each weight and bias. The gradient is a numeric value that tells you whether to increase or decrease its associated weight or bias, and gives you a hint at how big the increase or decrease should be.
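The article walks through the full algorithm; as a rough illustration of how a gradient drives a weight update, here is a hedged C# sketch of just the hidden-to-output portion. It assumes softmax output activation with cross-entropy error (a common pairing, under which each output node's gradient simplifies to computed output minus target value); it is not the article's complete implementation.

// A hedged sketch, not the article's full back-propagation code.
static void UpdateHiddenToOutputWeights(
    double[] hOutputs,     // computed hidden node output values
    double[] outputs,      // computed output node values
    double[] targets,      // known (desired) output values from the training data
    double[][] hoWeights,  // hoWeights[i][j] connects hidden node i to output node j
    double learnRate)
{
    int numHidden = hOutputs.Length;
    int numOutput = outputs.Length;

    // Output node gradients: positive means the computed output was too large.
    double[] oGrads = new double[numOutput];
    for (int j = 0; j < numOutput; ++j)
        oGrads[j] = outputs[j] - targets[j];

    // A weight's gradient is the signal feeding into the weight times the
    // gradient of the node the weight feeds. Moving against the gradient
    // decreases the weight when the gradient is positive and increases it
    // when the gradient is negative.
    for (int i = 0; i < numHidden; ++i)
        for (int j = 0; j < numOutput; ++j)
        {
            double grad = hOutputs[i] * oGrads[j];
            hoWeights[i][j] -= learnRate * grad;
        }
}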

Posted in Machine Learning

Neural Networks at the 2015 Interop Conference

The Interop Conference is one of the largest IT conferences in the world. I will be speaking at the 2015 event, from April 27 to May 1. See http://www.interop.com/lasvegas/.

My talk is titled “Solving Business Problems With Neural Networks”. I recently sat down with Clark Buckner, who works for TechnologyAdvice, and discussed trends in IT and software, my upcoming talk, and a bit about my role at Microsoft Research.

Interop2015_2

To join me at Interop Las Vegas, go to http://www.interop.com/lasvegas and enter the speaker discount code SPEAKERVIP for 25% off the price of the Total Access, 3 Day Conference, or 2 Day Workshop passes, or for a free Expo pass.

You can listen to a 15-minute audio podcast of my talk with Clark at:


Here is a transcript of some of the podcast:

James McCaffrey, Senior RSDE at Microsoft Research, was a recent guest on the TechnologyAdvice Expert Interview Series to share his insight on trends and challenges in the business intelligence industry. The series, which is hosted by TechnologyAdvice’s Clark Buckner, explores a variety of business and technology landscapes through conversations with industry leaders.

McCaffrey joined Buckner to discuss neural networks, machine learning, and the topics he will be covering during his presentation at Interop Las Vegas: Solving Business Problems with Neural Networks. (See http://www.interop.com/lasvegas/scheduler/session/solving-business-problems-with-neural-networks )

Here are a few of the highlights from our conversation:

TA: What is the biggest trend you’ve noticed in business intelligence this year?

James: The one thing that stands out more than anything else is that there’s just been a huge increase in interest in business intelligence and data science over the last six to eight months. Business intelligence has been around for many years, but because organizations can now harvest that data and store it in the cloud, the demand has exploded. Organizations now have access to all this data that they can dive into and make accurate decisions that affect their bottom lines.

In the 1990’s, people knew the Internet existed, but didn’t really understand what it was. Then it seems almost overnight the Internet became an essential tool for conducting business. In the same fashion, we’re seeing business intelligence become increasingly vital to driving key decisions in companies of all different sizes, and this switch will happen very fast.

Right now, we’ve raced ahead in our ability to capture data and what’s lagging behind is the ability to analyze those vast amounts of data. Traditional techniques were developed in the 1930’s, 40’s and 50’s before computers. And they rely on classical methodologies like calculus. Literally I’d say within the next one to two years we’re going to develop tools that will be able to support and crunch the data into valuable insights.

TA: Give us some insight on what you will be presenting at Interop Las Vegas this year.

James: Neural networks are one form of what’s called machine learning. Machine learning is just sort of a fancy way to say, ‘making predictions using data.’ Neural networks are these things that most people are somewhat vaguely familiar with — they are software systems that are modelled on biological synapses and neurons. In my talk at Interop, I’m going to explain this in detail so the information is easily understood — help people wade through all the scientific terms and industry jargon. That way we can delve into why they’re one of the key components that organizations can use to make powerful predictions using data.

Consider this: think about all the data that has been collected from the dawn of civilization until now. I'm talking about cave paintings, Egyptian papyrus scrolls, medieval books, modern books, electronic data, email messages, photos on social media, video, and every other form of data you can think of. Now, think about this huge amount of data. Stunningly, over 90 percent of that data has been generated and collected in the last 12 months. It's just astonishing. The rate of data generation and collection is growing at 50 percent every year. This is almost impossible to wrap your head around, and that's why neural networks and other forms of machine learning are growing so much in popularity.

TA: Wow, that’s a staggering statistic — but it certainly makes sense considering all industries seem to be culling data any way they can. For instance, email marketers are always collecting data on their consumers to find the best marketing automation strategies that will increase conversion rates. How can businesses keep up with constantly changing technology like this without falling behind? (See http://technologyadvice.com/marketing-automation/smart-advisor/?tid=PODCAST-mktauto-jamesmccaffrey-interop/#guide )

James: There is a real risk for organizations that don't take action and become proactive about business intelligence. Think about what happened to a lot of the printed publishers: magazines, newspapers, and so on. Instead of seeing the Internet as a threat or ignoring it altogether, the companies that are still around are the ones that embraced it and learned to adapt it as a new way to do business. The companies that waited too long or tried to skirt around the Internet are the ones that went under.

The single best way to keep up with the latest industry trends, in my opinion, is attending conferences. It's one thing to read about stuff online through various trade outlets (I personally write for InformationWeek, http://www.informationweek.com/, and MSDN Magazine), but at events like Interop you get to immerse yourself in the community and the latest news. The sessions and workshops are very insightful, but I find the most value comes from ad hoc, unplanned conversations and meetings with other attendees. Networking and interacting with your peers is priceless.


Listen to the entire show above in order to hear our full conversation, or download the show to listen later. You can subscribe to the TA Expert Interview Series via Soundcloud (see https://soundcloud.com/ta-expert-interviews/), in order to get alerts about new episodes. You can also subscribe to the Interop interviews from TechnologyAdvice (see https://soundcloud.com/ta-expert-interviews/sets/interop-las-vegas-speakers ).

This podcast was created and published by TechnologyAdvice. Interview conducted by Clark Buckner (see https://www.linkedin.com/in/clarkbuckner ).

Posted in Conferences, Machine Learning

Multi-Class Logistic Regression Classification

I wrote an article titled “Multi-Class Logistic Regression Classification” in the April 2015 issue of Microsoft MSDN Magazine. See https://msdn.microsoft.com/en-us/magazine/dn948113.aspx.

LogisticMultiClassDemoRun

The goal of a multi-class logistic regression problem is to predict something that can have three or more possible discrete values. For example, you might want to predict the political inclination (conservative, moderate, liberal) of a person based on predictor variables such as their age, sex, annual income, and so on.

Regular logistic regression (LR) predicts something that can have one of just two possible values. For example, predicting the sex (male, female) of a person. Regular logistic regression is one of the most basic forms of machine learning. In regular LR, one of the two possible predictions is arbitrarily assigned a 0 and the other possible value is assigned a 1. For example, male = 0 and female = 1. The input values produce a single numeric output value between 0.0 and 1.0. If the output value is less than 0.5 (i.e., closer to 0) you predict the 0 result (male). If the output value is greater than 0.5 (i.e., closer to 1) you predict the 1 result (female).
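As a hedged sketch (not the article's code), the regular LR computation is just a weighted sum of the inputs pushed through the logistic sigmoid function; the weight and bias values would come from training.

// A minimal sketch of two-class logistic regression prediction.
using System;

static class LogisticRegression
{
    static double ComputeOutput(double[] x, double[] weights, double bias)
    {
        double z = bias;
        for (int i = 0; i < x.Length; ++i)
            z += x[i] * weights[i];          // weighted sum of the inputs
        return 1.0 / (1.0 + Math.Exp(-z));   // logistic sigmoid, always in (0.0, 1.0)
    }

    static string Predict(double[] x, double[] weights, double bias)
    {
        double p = ComputeOutput(x, weights, bias);
        return (p < 0.5) ? "male (0)" : "female (1)";
    }
}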

Multi-class logistic regression extends this idea. If there are three possible classes in the thing to predict, for example, (conservative, moderate, liberal) then the input values produce three numeric values that sum to 1.0. For example, the output values might be (0.20, 0.70, 0.10). The predicted class is the one which corresponds to the largest output value (moderate).
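Again as a hedged sketch rather than the article's implementation, multi-class LR typically produces one raw score per class (one weighted sum per class), converts the scores into probabilities that sum to 1.0 using the softmax function, and then predicts the class with the largest probability:

// A minimal sketch of the multi-class prediction step.
using System;

static class MultiClassLR
{
    static double[] Softmax(double[] rawScores)
    {
        double max = rawScores[0];            // subtract the max for numeric stability
        for (int i = 1; i < rawScores.Length; ++i)
            if (rawScores[i] > max) max = rawScores[i];

        double sum = 0.0;
        double[] probs = new double[rawScores.Length];
        for (int i = 0; i < rawScores.Length; ++i)
        {
            probs[i] = Math.Exp(rawScores[i] - max);
            sum += probs[i];
        }
        for (int i = 0; i < rawScores.Length; ++i)
            probs[i] /= sum;                  // now the values sum to 1.0
        return probs;
    }

    static int Predict(double[] rawScores)
    {
        double[] probs = Softmax(rawScores);  // e.g., (0.20, 0.70, 0.10)
        int best = 0;
        for (int i = 1; i < probs.Length; ++i)
            if (probs[i] > probs[best]) best = i;
        return best;                          // e.g., 1 = moderate
    }
}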

Multi-class logistic regression isn’t used very much. I suspect this is because writing the code for multi-class LR is quite a bit trickier than for regular LR.

Posted in Machine Learning

An ASP.NET Web Form using jQuery get() to call Classic ASP

Most of my software development work is numerical and scientific programming, usually in a shell. But every now and then I have to do some Web work. I was looking at the jQuery get() function, which sends an HTTP GET request from the client browser to the Web server and then fetches the response from the server.

JQueryGetDemoRun

I coded up a little demo program to make sure I understood the get() documentation. My demo was a bit unusual because I was just experimenting. The demo is essentially an ASP.NET Web Form application which contains jQuery code to send a request to a classic ASP script.

To create the demo, I first used Visual Studio to create an “Empty Web Site”, as opposed to any other template, which would have given me more than I wanted. Then I used Notepad to create the ASP.NET demo. Because my annoying blog editor usually messes up anything with less-than or greater-than symbols, here's an image of the Demo.aspx (and the target classic ASP script).

JQueryGetDemoCode

Before I could run the demo, I needed a target to send the GET request to. I decided to use classic ASP written in VBScript. The Test.asp code looks for a request that has something like x=42 in the query string, peels off the value of x, and returns twice the value.
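The classic ASP script itself is shown in the image above rather than as text. For comparison only, here is roughly the same doubling endpoint written as a C# ASP.NET generic handler (.ashx); this is a swapped-in sketch of mine, not the VBScript Test.asp from the demo.

// TestDouble.ashx.cs -- not the demo's classic ASP script; a hypothetical
// C# equivalent. It reads x from the query string (e.g., ?x=42) and
// writes back twice the value (84).
using System.Web;

public class TestDouble : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string raw = context.Request.QueryString["x"];
        int x;
        int result = int.TryParse(raw, out x) ? 2 * x : 0;

        context.Response.ContentType = "text/plain";
        context.Response.Write(result.ToString());
    }

    public bool IsReusable
    {
        get { return false; }
    }
}

With a target like this, the jQuery get() call would simply point at TestDouble.ashx instead of Test.asp.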

The whole idea of the exercise was to understand the jQuery get() function. My call to get() is wrapped inside the ready() function so it will execute immediately after the Demo.aspx page loads. The get() call has three parameters. The first is the classic ASP target script; in most cases the target is just a data file. The second parameter is the information to send; in most cases this parameter is left out. I passed two values, x and t, but only x (with value 42) is used. The third parameter is an anonymous callback function that executes when the target returns a response. Here, the 'theData' object holds the return result. I just placed the result (84) into a div on the Demo page.

Posted in Machine Learning, Miscellaneous

Enabling IIS on a Windows 8.1 Machine

I installed IIS on a Windows 8.1 machine, but my Web pages were only visible from the machine running IIS, not from any other machine on my work network. To fix this, I had to modify a setting in Windows Firewall.

It all started when I needed to do some experiments with Web applications. I had a relatively new Dell desktop machine so I figured I’d install IIS (Internet Information Services) on the machine. How hard could it be?

First, I correctly remembered that you don't really install IIS; instead, you sort of activate it. So I went to Control Panel -> Programs and Features -> Turn Windows features on or off. In the dialog window I expanded the Internet Information Services item and checked IIS Management Console under the Web Management Tools section, and all the sub-items in the Application Development Features item.

01_WindowsFeatures

Then, to test, I launched Notepad with "Run as administrator" (necessary), created a tiny hello.html page, and saved it at C:\inetpub\wwwroot\Hello (I first created the Hello directory).

I launched Internet Explorer and navigated to http://localhost/Hello/hello.html and my page displayed. For a moment I thought everything was good. But when I tried to view the hello page from another machine on my work network, the page was not accessible. I don't remember the exact error message.

Then I remembered that there was some additional setting I needed, but I couldn't recall exactly what it was; I only set up a Web server once every couple of years or so. Anyway, after some Internet searching, I remembered that I had to adjust the Windows Firewall "World Wide Web Services" setting from Enabled = No to Enabled = Yes. I got to the firewall settings from Control Panel -> Administrative Tools -> Windows Firewall with Advanced Security, then the Inbound Rules item, then Properties.

02_IncomingConnections

After enabling the WWWS inbound rules, I restarted IIS from the IIS manager, and my Web page was then visible from other machines on my network.

03_Restart

Posted in Miscellaneous

Python Tools for Visual Studio

I do the majority of my software development using the C# language, but I also use C, C++, Perl, JavaScript, and others, including Python. Many of my colleagues believe that some languages just “have the right feel” to them. For me, Python is one of those good languages.

Until recently, I had always programmed in scripting languages using the plain old Notepad editor. There's something satisfying about using the simplest editor available. But I decided to try out the Python Tools for Visual Studio (PTVS). I hate the name but really like the tool. In essence, PTVS allows you to edit and run Python programs (OK, "scripts" if you want to get technical) inside Visual Studio.

PythonToolsForVisualStudioInAction

Let me cut to the chase and say I was very impressed by PTVS. The installation was quick, easy, simple, and trouble-free. I am always fearful of installing third-party tools because if my system gets messed up it can be very painful to recover. After installation, using PTVS was easy and intuitive, although part of this ease-of-use is probably due to my years of experience with Visual Studio.

Now, to be honest, PTVS wasn't earth-shattering, but three things made writing Python programs much easier than using Notepad. First, the editor colors keywords, string literals, parameters, and so on. I was surprised at how much difference this made. Second, PTVS automatically indents when appropriate (important because indentation is syntactically significant in Python). Third, the IntelliSense auto-complete feature really helps with library function names. In addition to these three features, the Visual Studio debugging mode was nice too.

Because I’m only a casual user of Python, I don’t know what the industry standard integrated editor is in Python-world. Or if there even is a most-common integrated editor. But for me, because I spend most of my developer hours in Visual Studio, PTVS is perfect.

Posted in Machine Learning, Miscellaneous