My Top Ten Favorite World Chess Championship Blunders

Here are my top 10 favorite blunders made in world chess championship matches. The word blunder is really too harsh a term, but in chess it is commonly used to describe almost any kind of oversight or miscalculation. By favorite, I mean that I could understand the mistake (many mistakes made by chess grandmasters are much too subtle for average chess players to understand), the mistake was interesting, and the opponent took advantage of it. In the recent 2014 match between V. Anand and M. Carlsen, Carlsen made a huge mistake, but Anand did not see it, so it’s not included here. That mistake is what prompted me to review previous championship matches for other mistakes.

A world chess championship match has been held every few years since the late 1800s. Although there have been disputes and chaos, the list of more or less official world champions has 16 members: Wilhelm Steinitz (1886), Emanuel Lasker (1894), J.R. Capablanca (1921), Alexander Alekhine (1927), Max Euwe (1935), Mikhail Botvinnik (1948), Vasily Smyslov (1957), Mikhail Tal (1960), Tigran Petrosian (1963), Boris Spassky (1969), Bobby Fischer (1972), Anatoly Karpov (1975), Garry Kasparov (1985), Vladimir Kramnik (2000), Viswanathan Anand (2008), and Magnus Carlsen (2013).

World chess championship games are played at an unbelievably high level, but even so, mistakes that an average chess player can understand are sometimes made. Here is my list of mistakes, in chronological order.


1. Lasker vs. Steinitz (1894, Game 5). In the diagrammed position below, Steinitz has just played 41… Qd7. Lasker won a queen immediately with 42. Qg1+ d4 43. Qg5+ Qd5 44. Rf5.

[Diagram: Lasker vs. Steinitz]


2. Capablanca vs. Lasker (1921, Game 5). In the diagrammed position below, Lasker has just moved his king out of check with 45… Kf8. Capablanca won a knight immediately with 46. Qb8+ Ke7 47. Qe5+ QxQ 48. RxQ+ followed by RxN.

[Diagram: Capablanca vs. Lasker]


3. Bronstein vs. Botvinnik (1951, Game 6). In the diagrammed position below, Bronstein, with the white pieces, has just played 57. Kc2 instead of stopping black’s advanced pawn by moving his knight with Ne6+ followed by Nd4. Botvinnik won with 57… Kg3! (instead of Kf3), and the pawn cannot be stopped.

[Diagram: Bronstein vs. Botvinnik]


4. Tal vs. Botvinnik (1960, Game 7). In the diagrammed position below, Botvinnik has just played 25… Bg6. Tal won quickly with 26. Rxd7+ Nxd7 27. Rxd7+ Kxd7 28. Nf6+ forking king and rook, ending up a piece ahead.

[Diagram: Tal vs. Botvinnik]


5. Petrosian vs. Spassky (1969, Game 14). In the diagrammed position below, Spassky has just advanced his king with 43… Ke4. Petrosian played 44. f3+ Kxe3 45. Rd2, threatening Re2 mate. To escape, Spassky gave up his rook for a knight with Rb3+ and, surprisingly, was able to draw the game.

[Diagram: Petrosian vs. Spassky]


6. Spassky vs. Fischer (1972, Game 1). In the diagrammed position below, Fischer has just captured white’s rook pawn with 29… Bxh2. Spassky trapped and won the bishop with 30. g3 followed by Ke2, Kf3, Kg2. This game shocked the chess world.

[Diagram: Spassky vs. Fischer]


7. Karpov vs. Korchnoi (1981, Game 2). In the diagrammed position below, Korchnoi has just made the move 34… f6 to free his bishop. Karpov pounced with 35. Rxa7, winning a pawn outright, because the rook cannot be recaptured: after 35… Qxa7 36. Qxe6+ Kh7 37. Qxc8, white wins a knight.

[Diagram: Karpov vs. Korchnoi]


8. Karpov vs. Kasparov (1990, Game 7). In the diagrammed position below, Kasparov has just played 27… Qa5. Karpov won a bishop with 28. Nd5, because if 28… Qxd2 29. Nxf6+ Kg7 30. Bxd2 and the knight on f6 is now protected by the rook on f1.

[Diagram: Karpov vs. Kasparov]


9. Anand vs. Gelfand (2012, Game 8). In the diagrammed position below, Gelfand has just played 16… Qxh1 after a three-move combination. Gelfand missed that Anand could play 17. Qf2, trapping black’s queen in the corner with no escape. This was the shortest championship game in history.

[Diagram: Anand vs. Gelfand]


10. Carlsen vs. Anand (2014, Game 2). In the diagrammed position below, Anand has just played 34… h5. Carlsen won immediately with 35. Qb7 threatening Rxg7+ followed by mate, to which there is no defense.

[Diagram: Carlsen vs. Anand]


Posted in Miscellaneous, Top Ten

Speech Recognition with .NET Desktop Applications

I wrote an article titled “Speech Recognition with .NET Desktop Applications” in the December 2014 issue of Microsoft’s MSDN Magazine. See http://msdn.microsoft.com/en-us/magazine/dn857362.aspx. In the article, I explain how to add voice commands to Windows desktop applications (WinForm apps, WPF apps, console shell apps).

[Screenshot: WinForm demo]

When I write, I usually present information that is new and isn’t available anywhere else. This article is an exception — all the information is available in documentation. But, in my opinion, the existing documentation was just awful, so I presented two complete end-to-end demo programs, and included detailed information about how to install the necessary libraries, and discussed pros and cons of various implementation alternatives.

By awful existing documentation, I mean that there is a huge amount of information on speech recognition, but it is disorganized, and just not useful from a software developer’s point of view.

As I note in the article, “Mastering the technology itself isn’t too difficult once you get over the initial installation and learning hurdles. The real issue with speech recognition and synthesis is determining when they’re useful.”

Posted in Machine Learning

A Big Gravity Powered Mechanical Computer

When I was a young man I remember seeing some mechanical, educational computer toys that absolutely fascinated me. I vividly recall the Hasbro “Think-A-Tron” (1960) that answered questions on little mini-punch cards, and the plastic Digi-Comp mechanical calculator (1967).

[Photo: the Gravi-Comp, viewed from the left]

Several people have built recreations of the Digi-Comp. I decided to do so too. I had access to some scrap acrylic plastic and a laser cutter. I designed my version (I called it Gravi-Comp just to distinguish it from the commercial re-creations) to be totally modular. The idea there is that I wanted to be able to test each module individually, and modify and replace each module when necessary, rather than have to modify the entire device. I had a lot of help from several colleagues, including Kirk Olynyk, Nathan Brown, and Chris O’Dowd.

Some video colleagues made a very short video that gives you a rough idea of what the device is like in action.

[Snapshot of the video]

Click here to see a short video clip

The resulting machine has about 150 separate modules, most of which are either 4 in. x 2 in. or 4 in. x 4 in. The device ended up being much larger than I thought it’d be. Overall, it’s 12 feet long by 3 feet wide. When tilted on its stand, the device stands over 7 feet tall.

[Photo: module test stand]

These mechanical computers are programmed by manually setting a series of switches (about 32 of them). At the top of the device is a hopper that holds 1.25 in. diameter acrylic balls. The balls are released one at a time. As each ball drops, it passes through switches, toggling some of them to the left or the right. The final answer to the programmed problem is stored as a binary number in seven of the switches.
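The ball-and-switch mechanism can be simulated in a few lines of Python. This is a hypothetical sketch, not the actual device's program: each falling ball flips the first switch it reaches; a switch that was already set resets and deflects the ball to the next switch, so a column of switches counts balls in binary, ripple-carry style.

```python
def drop_ball(switches):
    """Simulate one ball falling through a column of toggle switches.
    A switch at 0 flips to 1 and absorbs the ball (no carry); a switch
    at 1 flips to 0 and deflects the ball onward (ripple carry)."""
    for i in range(len(switches)):
        if switches[i] == 0:
            switches[i] = 1
            return
        switches[i] = 0  # carry: ball continues to the next switch

switches = [0] * 7   # seven result switches, least significant first
for _ in range(11):  # drop eleven balls
    drop_ball(switches)

# read the switches as a binary number
value = sum(bit << i for i, bit in enumerate(switches))
print(value)  # 11
```

Eleven dropped balls leave the switches reading binary 0001011, i.e. eleven.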

I’m not exactly sure what I’ll do with the Gravi-Comp now that it’s finished, but it was a lot of fun to build it on weekend mornings.

Posted in Miscellaneous

Fireworks Algorithm Optimization

I wrote an article titled “Fireworks Algorithm Optimization” in the December 2014 issue of MSDN Magazine. See http://msdn.microsoft.com/en-us/magazine/dn857364.aspx.

[Graph: fireworks algorithm optimization]

Many types of machine learning (ML) involve some sort of mathematical equation that makes predictions. Examples include logistic regression classification and neural network classification. These systems have a set of numeric values called weights. Training the system is the process of finding the values of the weights so that the error (between computed output values and known output values in a set of training data) is minimized. In other words, training is a numerical optimization problem.
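As a toy illustration of training-as-optimization, the sketch below uses simple gradient descent (not the fireworks algorithm) to find the single weight of a one-parameter model that minimizes squared error on a tiny made-up training set:

```python
# toy model: predicted = w * x; find the w that minimizes squared error
train_data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # (x, known y) pairs

w = 0.0            # the single weight to learn
learn_rate = 0.05

for epoch in range(200):
    grad = 0.0
    for x, y in train_data:
        predicted = w * x
        grad += 2.0 * (predicted - y) * x  # d(error)/dw for this item
    w -= learn_rate * grad                 # step downhill

print(round(w, 2))  # prints 2.04, the least-squares weight
```

A real ML trainer does exactly this, just with thousands of weights and a much more complicated error surface.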

There are at least a dozen different major numerical optimization algorithms. Examples include simple gradient descent, back-propagation, and particle swarm optimization. Fireworks algorithm optimization (FAO) is a new idea, proposed in 2010. The algorithm gets its name because, when the FAO process is displayed on a graph, the image somewhat resembles a set of exploding fireworks.
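A deliberately stripped-down, hypothetical sketch of the fireworks idea (the published algorithm also adapts explosion amplitudes and spark counts per firework, which I omit here): each generation, every firework explodes into random nearby sparks, and the best positions survive as the next generation's fireworks.

```python
import random

def error(w):
    # toy function to minimize; the minimum is at w = 3
    return (w - 3.0) ** 2

random.seed(0)
fireworks = [random.uniform(-10, 10) for _ in range(5)]

for generation in range(50):
    sparks = []
    for fw in fireworks:
        # each firework explodes into sparks scattered nearby
        sparks.extend(fw + random.uniform(-1, 1) for _ in range(10))
    # the best positions, fireworks or sparks, survive
    fireworks = sorted(fireworks + sparks, key=error)[:5]

best = fireworks[0]
print(round(best, 2))  # very close to 3.0
```

The repeated explode-and-select cycle is what produces the fireworks-like picture when the candidate positions are plotted.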

In the article, I present a demonstration coded using the C# language. I found FAO quite interesting. In my opinion, there’s not enough research evidence to state exactly how effective FAO is or isn’t.

Posted in Machine Learning

Selecting Excel Cells using Interop and Speech Recognition with C#

I’ve been investigating the idea of manipulating Excel spreadsheets by using voice commands. One task is to select a range of cells along the lines of “Select cells A1 to E7”. The trick to getting this to work is to use an Excel add-in (for some UI), Excel interop (to manipulate a worksheet programmatically), and speech recognition (to input a command).

[Screenshot: selecting cells with speech]

So, there are a lot of prerequisites needed, and they are quite version-sensitive. I was using Excel 2013 with Visual Studio 2012. To create the add-in I installed the Microsoft Office Developer Tools for Visual Studio 2012 ENU. This gave me an Excel add-in template, so when I did File -> New -> Project, I was able to select the C# -> Office Add-ins -> Excel 2013 Add-in template. Apparently this also gave me the ability to do interop, because the template-generated code contained a “using Excel = Microsoft.Office.Interop.Excel;” statement.

To use speech, I installed three 32-bit (64-bit doesn’t work with add-ins) Microsoft Speech Platform 11 packages: the SDK (to create speech recognition in VS), the Runtime (to use speech on my client machine), and English Language pack MSSpeech_SR_en-US_TELE.msi for recognition. Because I was not using speech synthesis, that is, making the computer talk, I did not download and install MSSpeech_TTS_en-US_Helen.msi (“text-to-speech”).

I named my demo Project ExcelAddInTest. In the Solution Explorer window, I right-clicked on the Project and selected Add -> New Item, User Control and renamed the associated file from UserControl1.cs to MyUserControl.cs. This brought me to a design surface with what is essentially a WinForm. I added a ListBox control to display diagnostic messages.

In file ThisAddIn.cs, I added this code:

public partial class ThisAddIn
{
  public MyUserControl muc;
  public CustomTaskPane ctp;

  private void ThisAddIn_Startup(object sender, System.EventArgs e)
  {
    muc = new MyUserControl();
    ctp = this.CustomTaskPanes.Add(muc, "MyControl");
    ctp.DockPosition =
      Office.MsoCTPDockPosition.msoCTPDockPositionTop;
    ctp.Visible = true;
  }

  // template code here, no changes
}

I did a quick sanity check by Building the Solution and then launching an Excel 2013 spreadsheet to verify I could see a ListBox named MyControl. I also did an Excel File -> Options -> Add-Ins -> Manage: COM Add-Ins, Go to verify the add-in was active and that I didn’t have any other old test add-ins.

I added the speech recognition functionality in the MyUserControl code, located in file MyUserControl.cs. First I added a reference to Microsoft.Speech.dll, which was located at C:\ Program Files (x86)\ Microsoft SDKs\ Speech\ v11.0\ Assembly\ on my machine (spaces added to the path for readability). Then I added a using statement: “using Microsoft.Speech.Recognition;” at the top of the code.

I got the speech set up like so:

public partial class MyUserControl : UserControl
{
  static SpeechRecognitionEngine sre;

  public MyUserControl()
  {
    InitializeComponent();

    System.Globalization.CultureInfo ci =
      new System.Globalization.CultureInfo("en-us");
    sre = new SpeechRecognitionEngine(ci);
    sre.SetInputToDefaultAudioDevice();
    sre.SpeechRecognized +=
      new EventHandler<SpeechRecognizedEventArgs>(sre_SpeechRecognized);

    // TODO
  }
. . .

I wanted a speech pattern to be like “Select cells A 1 E 7”, so I needed to recognize letters and numbers. I prepared to recognize letters like this:

Choices ch_Letters = new Choices();
string[] letters = new string[26];
for (int i = 0; i < 26; ++i)
{
  char cLetter = Convert.ToChar(i + 65);
  string sLetter = cLetter.ToString();
  letters[i] = sLetter;
}
ch_Letters.Add(letters);

Next I prepared to recognize numbers like so:

Choices ch_Numbers = new Choices();
string[] numbers = new string[20];
for (int n = 1; n <= 20; ++n)
  numbers[n-1] = n.ToString();
ch_Numbers.Add(numbers);

I recognize just 1 through 20; I found that recognition confidence drops noticeably as you increase how many numbers must be recognized. Next I set up the beginning of the phrase:

Choices ch_SelectCommand = new Choices();
ch_SelectCommand.Add("Select cells");

And then I constructed the rest of the phrase to recognize, loaded the resulting Grammar, and activated speech recognition:

GrammarBuilder gb_SelectRange =
  new GrammarBuilder();
gb_SelectRange.Append(ch_SelectCommand);
gb_SelectRange.Append(ch_Letters);
gb_SelectRange.Append(ch_Numbers); 
gb_SelectRange.Append(ch_Letters);
gb_SelectRange.Append(ch_Numbers);
Grammar g_SelectRange = 
  new Grammar(gb_SelectRange);
sre.LoadGrammarAsync(g_SelectRange);
sre.RecognizeAsync(RecognizeMode.Multiple);

And then I wired up the speech-recognized event handler:

void sre_SpeechRecognized(object sender,
  SpeechRecognizedEventArgs e)
{
  try
  {
    string txt = e.Result.Text;
    double conf = e.Result.Confidence;
    if (conf >= 0 &&
      txt.IndexOfAny(new char[] { 'A', 'B', 'C', 'D',
        'E', 'F', 'G', 'H' }) >= 0)
    {
      // "Select cells A 1 B 2" 
      string[] toks = txt.Split(' ');
      string s1 = toks[2] + toks[3];
      string s2 = toks[4] + toks[5]; 
          
      Excel.Worksheet active =
 Globals.ThisAddIn.Application.ActiveSheet;
      Excel.Range rangeToSelect = active.get_Range(s1, s2);
      rangeToSelect.Select();
      return;
    }
  }
  catch (Exception ex)
  {
    listBox1.Items.Add(ex.Message);
  }
}

I built the project and tested it by using speech to select cells A1 to E7 in a dummy spreadsheet as shown in the image above.

Posted in Machine Learning

My Top Ten Favorite Star Trek: The Next Generation Episodes

One weekend I decided to make a list of my top 10 favorite episodes of Star Trek: The Next Generation. It was quite difficult because the show has 178 episodes from seven seasons. Eventually I came up with this list. There are several Internet lists of STNG episode ratings, but my list does not seem to correlate closely with any of those lists.

I noticed that I tend to like episodes with elements of mystery and action, but not episodes with Q (his god-like powers just made stories too far-fetched), Guinan (I just didn’t like that character’s all-knowingness), or the holodeck (a lazy way to create a plot).

1. “Cause and Effect” – The story starts with the Enterprise exploding, killing everyone. As it turns out, the ship is stuck in a time loop, with each loop a bit different. Will the Enterprise escape? Of course they will, but how? (Season 5, Episode 118).

Cause and Effect

2. “Frame of Mind” – Riker finds himself in an insane asylum. What is real? How did he get there? Is this an alternate dimension? A Romulan plot? As it turns out, it is an alien (but not Romulan) plot. (Season 6, Episode 147).

Frame of Mind

3. “A Matter of Perspective” – Riker is charged with the murder of Dr. Nel Apgar who was developing a new energy source. There are different testimonies but they seem to point to Riker’s guilt. It turns out that Apgar had tried to murder Riker, but accidentally killed himself. (Season 3, Episode 62).

A Matter of Perspective

4. “Skin of Evil” – A strange black oil-like creature (Armus) kills Tasha Yar. I thought for sure Yar would be revived, but no. Most STNG ratings lists put this episode as one of the worst, but I thought it was tense, scary, and well-acted. (Season 1, Episode 23).

Skin of Evil

5. “The Wounded” – The captain of the Federation ship Phoenix, Benjamin Maxwell, goes rogue and destroys two Cardassian ships. Maxwell claims the Cardassians were preparing to attack. Were they? (Yes, they were) (Season 4, Episode 86).

The Wounded

6. “Parallels” – After competing in a fighting competition off-ship, Worf returns to the Enterprise but notices all kinds of subtle things are wrong. It turns out Worf is transitioning between parallel universes. (Season 7, Episode 163).

Parallels

7. “A Matter of Honor” – Riker is sent to a Klingon ship as part of an officer exchange program. I especially remember the scene where Riker’s authority is challenged – what would he do? Only one thing, a show of brute force, was the correct response. (Season 2, Episode 34).

A Matter of Honor

8. “The Defector” – A Romulan defector, sub-lieutenant Setal, comes to Picard with information about a Romulan invasion. Is it a trick? It turns out the defector was actually an admiral but he was duped with misinformation by his own people. (Season 3, Episode 58).

The Defector

9. “Datalore” – Data meets his evil brother Lore, and there is a dangerous crystalline space entity to deal with too. (Season 1, Episode 13).

Datalore

10. “Q Who” – The first appearance of the Borg puts this episode on my list automatically, even though I dislike the Q character and the Guinan character as lazy plot devices. The Borg are a cliche now, but they were novel and scary in 1989. (Season 2, Episode 42).

Q Who

Notes: There are many episodes that didn’t quite make my personal top 10. I remember the dinner bowl of worms in “Conspiracy” where alien parasites are taking over Star Fleet. The surprising return of Tasha Yar in “Yesterday’s Enterprise”. Wesley Crusher at the Academy in “The First Duty”. Picard as a prisoner in “The Chain of Command”. And of course, the two-part “The Best of Both Worlds” where Picard is assimilated by the Borg.

After I made my list, I went to a STNG ratings Web site. My top 10 were ranked by that site as 4th, 38th, 135th, 119th, 62nd, 9th, 47th, 54th, 57th, 12th respectively — essentially no agreement except for the “Cause and Effect” (my #1) and “Parallels” (my #6) episodes.

Posted in Top Ten

Neural Networks using Python

I wrote an article titled “Use Python with your Neural Networks” in the November 2014 issue of Visual Studio Magazine. See http://visualstudiomagazine.com/articles/2014/11/01/use-python-with-your-neural-networks.aspx. Although there are several Python implementations of neural networks available, sometimes writing your own code from scratch has advantages. You fully understand the code, you can customize it to meet your needs exactly, and you can keep the code simple (for example, by removing some error checks).

[Screenshot: the Visual Studio Magazine article]

Python is an interesting language. According to several sources, Python is currently one of the ten most common programming languages, and it’s making gains in education (thank goodness — it’s well past time for schools to move away from Java). If you work mostly with languages other than Python, a good side effect of exploring a neural network implemented using Python is that the process serves as a very nice overview of the language.
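To give a flavor of what from-scratch means, here is a minimal, hypothetical Python sketch of just the feed-forward computation for a fully connected network with tanh hidden activation. The function name and weight values are made up for the example; a real implementation, like the one in the article, adds biases and training code.

```python
import math

def forward(x, ih_weights, ho_weights):
    # hidden node values: tanh of the weighted sums of the inputs
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)))
              for row in ih_weights]
    # output node values: plain weighted sums of the hidden values
    return [sum(w * h for w, h in zip(row, hidden))
            for row in ho_weights]

# 2 inputs -> 2 hidden -> 1 output, with made-up weights
ih = [[0.1, 0.2], [0.3, 0.4]]  # one row of weights per hidden node
ho = [[0.5, 0.6]]              # one row of weights per output node
print(forward([1.0, 2.0], ih, ho))
```

Because everything is plain lists and loops, it's easy to see, and to customize, exactly what the network computes.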

Posted in Machine Learning