Bing Maps 8 Color Gradient Legend

I’ve been investigating the Bing Maps V8 library. The documentation has many good but very short examples. The way I learn best is by taking existing examples and modifying them.


I created a demo that combines several features. The color gradient legend in the upper left corner of the map is created programmatically. Behind the scenes, legend information is stored so that each value between 0 (purple) and 100 (red) can be mapped to a color encoded as an RGB triple.
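The post doesn't show the mapping code, but the idea can be sketched as simple linear interpolation between color stops. The particular purple-blue-green-yellow-red stops below are assumptions for illustration, not the demo's actual values:

```javascript
// Sketch of a value-to-color mapper like the one behind the legend.
// Each stop pairs a legend value with an RGB triple.
var stops = [
  { v: 0,   rgb: [128, 0, 128] },  // purple
  { v: 25,  rgb: [0, 0, 255] },    // blue
  { v: 50,  rgb: [0, 255, 0] },    // green
  { v: 75,  rgb: [255, 255, 0] },  // yellow
  { v: 100, rgb: [255, 0, 0] }     // red
];

function valueToColor(val) {
  // clamp to the legend's 0-100 range
  if (val <= 0) return stops[0].rgb.slice();
  if (val >= 100) return stops[stops.length - 1].rgb.slice();
  // find the two stops that bracket val and interpolate linearly
  for (var i = 1; i < stops.length; ++i) {
    if (val <= stops[i].v) {
      var lo = stops[i - 1], hi = stops[i];
      var t = (val - lo.v) / (hi.v - lo.v);
      return [
        Math.round(lo.rgb[0] + t * (hi.rgb[0] - lo.rgb[0])),
        Math.round(lo.rgb[1] + t * (hi.rgb[1] - lo.rgb[1])),
        Math.round(lo.rgb[2] + t * (hi.rgb[2] - lo.rgb[2]))
      ];
    }
  }
}
```

For example, `valueToColor(50)` lands exactly on the green stop, and values in between blend the two neighboring stops.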

The legend is rendered as a custom pushpin. Whenever the user zooms the map or scrolls, the legend is redrawn so that it always appears in the upper left corner.

The demo has the shape of Colorado stored in WKT (“well-known text”) format. When the user clicks the “Color Colorado” button, the numeric value in the textbox (90 in the image) is fetched, converted to a color, and that color is used to shade Colorado.
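Bing Maps 8 reads the WKT directly, but as a standalone illustration of what the format looks like, here's a toy parser. The `parseWktPolygon` helper and the rough Colorado bounding box are my own stand-ins, not the demo's data:

```javascript
// Toy WKT parser for illustration only: handles a simple POLYGON with one
// ring and no holes, and returns its coordinate pairs.
function parseWktPolygon(wkt) {
  var m = wkt.match(/^POLYGON\s*\(\(([^)]*)\)\)$/i);
  if (m === null) throw new Error("not a simple POLYGON");
  return m[1].split(",").map(function(pair) {
    var parts = pair.trim().split(/\s+/);
    return { lon: parseFloat(parts[0]), lat: parseFloat(parts[1]) };
  });
}

// Rough bounding box of Colorado (approx. 37-41 N, 102-109 W) as a
// stand-in for the real state outline:
var coloradoWkt =
  "POLYGON((-109.05 41.00, -102.04 41.00, -102.04 37.00, " +
  "-109.05 37.00, -109.05 41.00))";
var ring = parseWktPolygon(coloradoWkt);
```

Note that WKT closes a ring by repeating the first vertex as the last one.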

Very interesting exercise.


Posted in Miscellaneous | Leave a comment

A Quick Look at GNU Octave 4.0

One of the most common data analysis tools is MATLAB. But MATLAB is a commercial product and is quite pricey, so several free MATLAB-like tools have been created. Two of the most popular of these semi-clones are Scilab and GNU Octave.

I hadn’t looked at Scilab or Octave in quite a few months, so I thought I’d see what’s new in Octave. (I’ll revisit Scilab some other time.)

Octave version 4.0 was released in June 2015. The current version as of the day I’m writing this blog post is 4.0.3, which was released in July 2016.


My last use of Octave was with version 3. The big change I found in version 4 is that Octave now comes with a GUI shell by default; v3 was command-line only.

I installed Octave on a machine running Windows 10. The installer gave me a warning that Octave 4 has not been thoroughly tested on Windows 10, but I installed anyway. Installation went smoothly.

I tried one of the standard Hello World examples for Octave, creating a 3D plot of the so-called peaks function.

Before version 4 of Octave, I preferred Scilab, but with Octave’s nice new GUI my preference has shifted to Octave, mostly because Octave is more closely compatible with MATLAB than Scilab is.

Posted in Machine Learning | Leave a comment

WPF Performance Analysis with Visual Studio 2015 Update 2/3

A WPF (Windows Presentation Foundation) application is the modern way to create desktop applications for Windows systems. With all the interest and hype surrounding Web applications and Mobile applications, it’s easy to forget that desktop applications still have a role in software development.

I’ve got to admit that I’m not a fan of WPF’s UI model, which uses a specialized form of XML called XAML to define buttons and layouts; for me, truth lies in code, and XAML is just an abstraction that gets translated to code behind the scenes. That said, however, overall I do like WPF.

Update 3 for Visual Studio 2015 was released a few weeks ago, and I was poking around to see what’s new. One thing I noticed is a big expansion of the profiling and analysis tools. These tools can be accessed through a Diagnostics Hub, which contains tools to analyze CPU, GPU, Resource Contention, Memory Usage, and the “Application Timeline” (time usage).


I looked at the Application Timeline tool for WPF applications; it used to be called the XAML UI Responsiveness Tool. It’s a pretty sophisticated tool that lets you examine the timing of different aspects of a WPF application’s UI. Pretty cool.


Posted in Miscellaneous | Leave a comment

Bing Maps V8 Talk at Microsoft Research

One of the coolest things about working in the technology field is that it’s constantly evolving. The new Bing Maps 8 library is a huge advance over Bing Maps 7. Yesterday, I gave a talk with Ricky Brundritt from the Bing Maps 8 team. Actually, I did only three super-short demos to illustrate what I felt were the most important new features from a practical point of view. Ricky did most of the talking, presenting a thorough overview of Bing Maps 8.


From my perspective, the most important new features in Bing Maps 8 are two that relate to user interaction (creating pushpins programmatically and a drawing control) and two that relate to large data (heat maps and clustering).

After listening to Ricky’s part of the talk, I realized that Bing Maps 8 has a ton of new features. Many of them target niche, specialized scenarios and were developed in response to customer/user requests. The number of new features is almost overwhelming and represents a ton of work by the Bing Maps team.

Posted in Conferences | Leave a comment

JavaScript Park-Miller Random Numbers

JavaScript is a very strange language in several regards. The built-in JavaScript random number generator is not seedable, so you can’t get reproducible random numbers even when you want them (typically during development).

One of the oldest and best-known RNG algorithms is usually called the Park-Miller algorithm, because of a really well-written 1988 article by Park and Miller, even though they didn’t create the underlying technique. I coded up a JavaScript implementation of a Park-Miller RNG that’s good enough for lightweight work.


One drawback of the JavaScript implementation is that it uses a global variable named seed, so you have to be careful not to use that name elsewhere. The implementation is odd, too: a function declaration returns an anonymous function, which in turn is assigned as a member of the existing Math object! Kind of crazy, but that’s JavaScript.
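The post's exact code isn't shown, but the pattern it describes can be sketched like this. The name `Math.seededRandom` is my own assumption; the constants are the standard "minimal standard" Park-Miller values:

```javascript
// A minimal Park-Miller generator in the style the post describes:
// a function declaration returning an anonymous function, attached to the
// Math object, with the state kept in a global variable named seed.
var seed = 1;  // global state -- be careful not to reuse this name

Math.seededRandom = (function() {
  var m = 2147483647;  // 2^31 - 1, a Mersenne prime
  var a = 16807;       // the "minimal standard" multiplier, 7^5
  return function() {
    seed = (seed * a) % m;  // a * (m-1) < 2^53, so this is exact in doubles
    return seed / m;        // scale into (0, 1)
  };
})();
```

Resetting `seed` reproduces the sequence, and the burn-in mentioned below is just calling `Math.seededRandom()` 20 or so times and discarding the results. Note that for seed 1 the first return value is 16807/2147483647, which is tiny.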



While experimenting with the Park-Miller RNG, I noticed some unusual behavior. First, for typical small seed values (1, 2, 3, and so on), the first generated number is very small. Second, the eighth and ninth generated numbers are always very close to each other. A fix for this undesirable behavior is to burn away the first 20 or so generated values. I searched the Internet and couldn’t find much information on this weakness of the Park-Miller algorithm.

Posted in Machine Learning | Leave a comment

Betting Strategies

It’s well known that if you play a betting game many times where the probability of winning each time is less than 0.50 (for example, craps or roulette), then no matter what your money-management betting strategy is, you’ll lose in the long run. This is one part of what’s called Gambler’s Ruin.


The standard theoretical counter-strategy is for the player to double his bet after each loss, the idea being that eventually he’ll win. For example: bet 1 and lose, bet 2 and lose, bet 4 and lose, bet 8 and win (at this point the player is ahead 1), then bet 1 again, and so on.

The problem with this strategy is that eventually you’ll hit a maximum bet limit set by the House.
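The arithmetic behind being "ahead 1" is easy to verify: after k straight losses the player is down 1 + 2 + ... + 2^(k-1) = 2^k - 1 units, and the next winning bet of 2^k units recovers that plus one. A minimal bookkeeping sketch (the `martingaleNet` helper is my own, just for illustration):

```javascript
// Doubling-strategy bookkeeping: double the bet after each loss, reset to
// the base bet of 1 unit after a win. Returns the net result of a sequence
// of outcomes, each 'W' (win) or 'L' (loss).
function martingaleNet(outcomes) {
  var net = 0, bet = 1;
  for (var i = 0; i < outcomes.length; ++i) {
    if (outcomes[i] === 'W') {
      net += bet;
      bet = 1;      // back to the base bet after a win
    } else {
      net -= bet;
      bet *= 2;     // double after a loss
    }
  }
  return net;
}
```

The house-limit problem shows up fast: with a $5 base bet, seven straight losses call for a $640 eighth bet.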

Well, that’s all true, but money management is still important. When I play Blackjack I count cards (I learned how to do this while in college), so my probability of winning on one hand is extremely close to 0.50. For a betting session I typically start with 20 units ($100 at a $5 table, $500 at a $25 table, and so on). I play until I win 10 units, or I lose my 20-unit stake, or one hour passes (because by then I’m getting tired and will start to make mistakes).

Using this strategy, I typically leave the table a winner about 60% of the time. (Of course my overall expectation is still negative.) There’s a lot of psychology involved too; for example, losing $200 hurts a lot, but losing $400 hurts more than twice as much.

Posted in Machine Learning | Leave a comment

JavaScript Wichmann-Hill Random Number Generator

The JavaScript language has a built-in Math.random() function that emits random numbers between 0.0 and 1.0, but that RNG is not seedable, so you get a different sequence of values each time your program runs, even in situations where you want reproducible results.


Some time ago I coded up a seedable JavaScript RNG that is very crude. It works by taking the trigonometric sine of the previously generated value and then keeping just the decimal part.
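That crude scheme looks roughly like this; the starting seed of 0.5, the scale factor of 10000, and the names are my assumptions, not the original code:

```javascript
// Crude seedable RNG: sine of the previous value, scaled up, keeping only
// the fractional part. Seedable and reproducible, but statistically weak.
var sinSeed = 0.5;  // assumed starting seed

function sinRandom() {
  var x = Math.sin(sinSeed) * 10000;
  sinSeed = x - Math.floor(x);  // keep just the decimal part, in [0, 1)
  return sinSeed;
}
```

Resetting `sinSeed` to the same value replays the same sequence, which is the whole point for development work.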

For fun, I decided to see if I could implement the Wichmann-Hill algorithm in JavaScript. Wichmann-Hill is an old algorithm but is pretty effective: it combines three simple generators to produce one pretty good random value.
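A minimal sketch of the 1982 Wichmann-Hill algorithm, using its published constants (the global seed names and the seed values 1, 2, 3 here are my own, not necessarily what my implementation used):

```javascript
// Wichmann-Hill: three small linear congruential generators whose scaled
// outputs are summed and taken modulo 1. Each seed should start in
// [1, 30000].
var s1 = 1, s2 = 2, s3 = 3;  // global seed state for this sketch

function wichmannHill() {
  s1 = (171 * s1) % 30269;
  s2 = (172 * s2) % 30307;
  s3 = (170 * s3) % 30323;
  return (s1 / 30269.0 + s2 / 30307.0 + s3 / 30323.0) % 1.0;
}
```

The three moduli are pairwise coprime, which is what gives the combined generator its long period.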


My Wichmann-Hill RNG looks good enough for casual use but I didn’t subject it to thorough testing.

Note: Thanks to Stephen Jones (Consilio) who pointed out weaknesses of the sin-algorithm and suggested taking a look at the XOR-shift algorithm.

Posted in Machine Learning | Leave a comment