Several weeks ago I did an audio interview for a podcast for the “This Week in Machine Learning & AI” Web site. The interview has been posted. See https://twimlai.com/twiml-talk-013-understanding-deep-neural-networks-james-mccaffrey-interview/.
The interview was hosted by Sam Charrington. I've never met Sam in person (yet), but he will be the moderator for the upcoming "Future of Data Summit" panel at the Interop ITX conference in May, where I'll be a panelist along with speakers from IBM, Intel, and other companies. See https://twimlai.com/futureofdata/
Anyway, Sam had a very good grasp of current trends in deep learning, and he asked me a ton of interesting questions. I think I was able to give a reasonably intelligent answer or opinion to most of them (but, as always in a live interview, I botched a couple of things).
We were originally scheduled to talk for just a few minutes, but the interview/conversation went on much longer than expected, in part because the field of deep learning is growing so fast that there are exciting new ideas every few months.
I had the most fun attempting to explain current research that is trying to go beyond image and text recognition and achieve text comprehension — a huge challenge. My new Microsoft colleague Paul Smolensky has some very interesting ideas in this area of research.
I also enjoyed talking about a relatively new type of neural network called a Residual Network (RN). An RN is somewhat similar to an LSTM (Long Short-Term Memory) network, which in turn is an enhanced variation of an ordinary RNN (Recurrent Neural Network). Whew! That's a lot of relationships and acronyms.
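To make the residual idea concrete, here's a minimal NumPy sketch of a residual block. Everything here (the toy two-layer transformation F, the weight shapes, the names) is hypothetical and just for illustration; the key point is the skip connection y = x + F(x), which gives gradients a direct identity path through the network, loosely analogous to the memory path in an LSTM.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    # Standard rectified linear activation
    return np.maximum(z, 0.0)

def residual_block(x, w1, w2):
    """Apply y = x + F(x), where F is a toy two-layer transformation."""
    f = relu(x @ w1) @ w2   # F(x): the learned "residual" part
    return x + f            # skip connection adds the input back unchanged

# Hypothetical weights, sized so the output shape matches the input shape
# (a requirement for the skip connection to work)
d = 4
w1 = rng.normal(scale=0.1, size=(d, d))
w2 = rng.normal(scale=0.1, size=(d, d))

x = rng.normal(size=(2, d))      # a batch of 2 input vectors
y = residual_block(x, w1, w2)
print(y.shape)                   # same shape as the input
```

Note that if the weights are all zero, F(x) is zero and the block passes its input through untouched; that "easy identity" is exactly what makes very deep residual networks trainable.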
I guess the moral of the story is that there is tremendous research activity in "exotic" neural network architectures. There are intense efforts at Microsoft, Google, and IBM in particular, but also at hundreds of other academic and research institutions.
If you have any interest in AI and ML, I recommend that you check out Sam’s Web site, and the Interop ITX Conference (links above).