Here, quite briefly, are the notes I took from last night's airing of The Smartest Machine on Earth
on the PBS show NOVA. I had intended these to be paragraphs, but sometimes we don't get what we want. Enjoy!
- It’s official: Watson is a “He.”
- 10:08 PM ET: First reference to HAL.
- The typical Jeopardy! contestant takes less than three seconds to buzz in.
- Computers have historically struggled to understand subtleties and recognize objects, something that comes naturally to humans.
- The question that got it all started: “What if a computer could play Jeopardy! as well as Ken Jennings?”
- Great quote from David Ferrucci: “Maybe this isn’t as impossible as we think it is.”
- Watson researchers plotted Jeopardy! winners and their scores in a cloud. “It was a scary metric,” wrote researcher David Gondek during the live blogging session. [But] to Dr. Ferrucci, “the cloud made it clear that Jeopardy! was about answering accuracy and confidence.”
- Early struggles: Jeopardy! champions play at a 90 percent accuracy rate. Early versions of Watson struggled to get even 10 percent of the clues right.
- Early Watson systems were fed hundreds of Jeopardy! questions. To answer them, Watson scanned and cross-referenced keywords in its database, but confused Edmund Halley with the Pink Panther.
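The keyword scanning and cross-referencing described above can be sketched as a naive word-overlap score. This is my illustration, not IBM's code; the clue, candidate passages, and scoring are invented, and real systems are far more sophisticated — but it shows why shallow keyword matching is the fragile starting point the episode describes.

```python
# Minimal sketch (assumed, not IBM's approach): score candidate answers
# by counting how many clue words also appear in a passage about them.
# The clue and passages below are made up for illustration.

def keyword_score(clue, passage):
    """Count how many distinct clue words also appear in the passage."""
    return len(set(clue.lower().split()) & set(passage.lower().split()))

clue = "This astronomer predicted the return of the comet named for him"
candidates = {
    "Edmund Halley": "Edmund Halley was the astronomer who predicted "
                     "the return of the comet named for him",
    "The Pink Panther": "The Pink Panther films feature the theft of "
                        "a famous pink diamond",
}
best = max(candidates, key=lambda name: keyword_score(clue, candidates[name]))
```

Because the score is just word overlap, a passage that happens to share incidental words with the clue can outrank the right answer — which is how an early system ends up mixing up an astronomer and a cartoon cat.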
- Why Watson is different from Deep Blue: Chess is “easy” for computers to understand because the rules are clear and well-defined. The board has a fixed number of squares, each piece can only move in certain ways, and there's only one goal: capture the king. However, computers play it differently than humans do. Whereas humans approach the game using strategic concepts (control the center, attack from the flanks), computers look at every possible move and play out the repercussions 40 steps ahead. It’s sheer brute force. Jeopardy!, by comparison, is much less predictable. Categories span the entirety of human knowledge, and clues can be written in thousands of different ways.
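That brute-force lookahead is, at its core, minimax search: play out every move to the bottom of the tree and back the scores up, assuming the opponent always picks their best reply. Here's a minimal sketch on a tiny invented game tree (real chess engines add depth limits, pruning, and evaluation functions on top of this idea):

```python
# Minimal sketch of minimax search, the brute-force lookahead behind
# chess engines like Deep Blue. The toy game tree and leaf scores below
# are invented; real engines search millions of chess positions.

def minimax(node, maximizing, tree, scores):
    """Best achievable score from `node`, exploring every move."""
    if node not in tree:               # leaf position: static evaluation
        return scores[node]
    child_values = (minimax(child, not maximizing, tree, scores)
                    for child in tree[node])
    return max(child_values) if maximizing else min(child_values)

# Two moves available at the start; the opponent replies to each.
tree = {"start": ["a", "b"], "a": ["a1", "a2"], "b": ["b1", "b2"]}
scores = {"a1": 3, "a2": 5, "b1": 2, "b2": 9}
best = minimax("start", True, tree, scores)   # maximizer moves first
```

The maximizer picks move "a" here: although "b" leads to the biggest possible score (9), the opponent would never allow it, answering with "b1" for a score of 2 — exactly the every-possibility reasoning that doesn't transfer to an open-ended game like Jeopardy!.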
- Can you teach a computer common sense? Yes, up to a point. You can write code that tells it that “dead is forever,” but that approach will never help it understand the basic principles needed to answer sophisticated questions. For that, you need to pursue Machine Learning.
- Machine Learning is the way computers learn that “A” is an “A,” whether it’s rendered in Helvetica, Uncial or Crayola. Of the dozens of pieces of data that go into an answer, Machine Learning is the technique that helps a computer decide which pieces are important and which to drop. It replaces brute force with flexibility. It’s the same technology that’s used to generate predictive weather models and Netflix recommendations. Once Watson researchers turned down that path, the system’s accuracy headed straight for the champion cloud.
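The "decide which pieces are important" idea can be sketched as learning weights over evidence signals. This is my illustration, not Watson's actual model: a tiny logistic classifier, trained by plain gradient descent on made-up examples, learns that one invented signal (keyword overlap) predicts a correct answer while another (source popularity) doesn't:

```python
import math

# Minimal sketch (assumed, not Watson's model): machine learning as
# weighting pieces of evidence. A two-feature logistic model learns
# from made-up labeled examples which signal matters.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Each row: (keyword_overlap, source_popularity), label 1 = answer was
# correct. Overlap tracks the label; popularity is noise. All invented.
data = [((0.9, 0.1), 1), ((0.8, 0.9), 1),
        ((0.2, 0.8), 0), ((0.1, 0.2), 0)]

w = [0.0, 0.0]
for _ in range(2000):                  # plain stochastic gradient descent
    for (x1, x2), y in data:
        p = sigmoid(w[0] * x1 + w[1] * x2)   # predicted confidence
        grad = p - y                          # log-loss gradient
        w[0] -= 0.5 * grad * x1
        w[1] -= 0.5 * grad * x2
```

After training, the weight on the predictive signal dominates — the model has learned, rather than been told, which evidence to keep and which to drop. The sigmoid output doubles as a confidence score, which is exactly the accuracy-and-confidence framing Ferrucci took from the champion cloud.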
- Watson’s audition in front of the Jeopardy! producers was “one of the tensest days” the team ever had. In the early days, the system wasn’t ready for prime time: its performance was too erratic, and it had trouble pronouncing Roman numerals: "Who is...Henry Veeee?"
- Ferrucci: “Watson is an information-seeking tool capable of understanding your questions and dialoguing with you to make sure you get what you want.”
To my ears, that last point sounds a lot like the conversations between IT and Business in a typical business analytics deployment. Could Watson be the bridge between these two solitudes?