IBM’s recent Jeopardy! victory was also a victory for those who work in the esoteric world of artificial intelligence (AI). AI has been around for years, but after some amazing visibility in the 1980s and 1990s, it seemed to go underground. As the Watson demonstration showed, though, AI is alive, well, and better positioned than ever to help solve human problems amid the ever-growing information glut.

AI is mainly a software technology, but it has greatly influenced hardware design. Back in the mid-1980s, AI gained popularity just after the big growth spurt of the IBM PC and all its clones. For some reason, fate, or maybe just a bad decision on my part, I got involved with AI then. I had sold my company in Florida and had been dumped by a failed startup, so I was looking for something interesting to do. AI was it.

It really turned out to be a fad and a niche, but a strange and interesting diversion for me, and I even wrote two books about it. Nevertheless, I don’t consider myself an expert in this very broad and highly academic field, so my tenure was short. By the mid-1990s I was on to something else.

Recently I came across one of my books, Crash Course in Artificial Intelligence and Expert Systems (SAMS, 1987), collecting dust on an old bookcase and started to wonder whatever happened to this weird and wacky technology. I just found out—it’s alive and well in Watson and more.

What the Devil is AI?

AI involves making computers act more like people. Specifically, AI makes computers mimic human characteristics, especially thinking and problem solving. First, designers create a knowledge base, which is similar to a database but richer, about a specific subject or problem. Then they implement inference algorithms so the AI can apply the knowledge base to solve problems, answer questions, or do the work of a human. Conventional computing, on the other hand, relies on more linear processing algorithms.
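The knowledge-base-plus-inference idea can be sketched in a few lines of code. This is a minimal illustration of a rule-based system with forward chaining; the facts and rules here are hypothetical examples, not taken from any real system.

```python
# Minimal sketch of a rule-based system: a "knowledge base" of if-then
# rules plus a simple forward-chaining inference loop.
# The facts and rules below are made-up illustrations.

facts = {"has_fever", "has_cough"}

# Each rule: if all of its conditions are known facts, assert a new fact.
rules = [
    ({"has_fever", "has_cough"}, "possible_flu"),
    ({"possible_flu"}, "recommend_rest"),
]

# Forward chaining: keep applying rules until no new facts appear.
changed = True
while changed:
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(sorted(facts))
# ['has_cough', 'has_fever', 'possible_flu', 'recommend_rest']
```

Real expert systems of the era worked on this same principle, only with hundreds or thousands of rules and far more sophisticated conflict-resolution strategies.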

Search and pattern matching are some of the key inferencing processes in AI. Probability and statistics also play a major role in some problems. These techniques are effective but very time consuming. However, computers have become fast enough to make them more practical than ever. And with massive, fast, low-cost memory, it is now possible to build some humongous knowledge bases.
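The probabilistic side can be shown with a one-function example: Bayes' rule updates a prior belief in a hypothesis when new evidence arrives. The numbers below are made up purely for illustration.

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E), where
# P(E) = P(E|H) * P(H) + P(E|not H) * P(not H).
# All probabilities below are made-up illustrations.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# A hypothesis starts at 10% belief; strong supporting evidence arrives.
p = posterior(prior=0.10, p_e_given_h=0.9, p_e_given_not_h=0.2)
print(round(p, 3))  # 0.333
```

Chaining many such updates over a large knowledge base is one reason these methods were so compute-hungry in the 1980s, and why faster hardware made such a difference.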

The main application areas of AI have been expert systems that solve specific problems, natural language interpretation, robotics, computer vision, and education. The medical field was an initial beneficiary of expert systems that could take in symptoms and provide diagnoses based on a massive knowledge base.

Many special programming languages have been developed for AI, including LISP (list processing), Prolog, SAIL, LOGO, OPS5, Smalltalk, and a dozen others. They are no longer widely used, as most AI seems to be programmed in C or some derivative just like everything else these days.

I did manage to learn LISP, which is one of the oldest higher-level languages. It came along right after FORTRAN in the 1950s. Texas Instruments (TI) and a few others used to make specialized minicomputers called LISP machines whose architecture was optimized for LISP. If you like nested parentheses, you will love LISP. It is bizarre but very effective for the kinds of problems you come across in AI.

What About Watson?

Watson, named after IBM founder Thomas Watson, is IBM’s latest supercomputer based on its POWER7 multi-core processors (see the figure). Its programming was optimized for Jeopardy! At the heart of it is an advanced natural language processing system, which can deal with the nuances of our colloquial, idiomatic language and understand the metaphoric, analogical, and other quirks of how we speak. But the software had to be tweaked repeatedly based on its performance while playing Jeopardy!

The knowledge base must be legendary with input from books, encyclopedias, and many other sources, but the language understanding and speech recognition are world class for sure. One key element of Watson is its ability to learn. That is a more recent development in AI that I never got to. Neural networks and other techniques let computers learn from their inputs and adapt accordingly.

If you would like to dig deeper on this subject, check out the recent book by Stephen Baker: Final Jeopardy: Man vs. Machine and the Quest to Know Everything (Houghton Mifflin Harcourt, 2011). It takes a good look at the behind-the-scenes development of Watson and of its predecessor, Deep Blue, which bested world chess champion Garry Kasparov not too long back.

Another good recent AI update appeared in Wired’s January 2011 issue. In his introduction to several AI articles, author Steven Levy says that “AI is all around us.” He goes on to explain that AI does not try to recreate the human brain, but instead uses techniques like machine learning, massive knowledge bases, and clever algorithms to solve specific problems. Other articles explain AI uses on Wall Street and in music, medicine, transportation, and other fields. Internet searches use AI methods, and we can probably look for more AI methods, along with natural language understanding, to make searches even better.

AI is better than ever. I doubt that it will ever replace humans in our time as some believe, but it will definitely help us solve specific problems better and faster than we could without it. IBM is looking for real-world applications for Watson, and there are probably plenty out there in business, medicine, law, and the military (see “Watson, the Jeopardy! Champion Computer, Can Help You Too” at http://electronicdesign.com/article/embedded/Watson-The-i-Jeopardy-i-Cha...). The biggest problem is building the knowledge base by extracting information from existing sources and, especially, from experts in the field.

As for the implications for design engineers, it is harder to say. One technique that I recall having some application in embedded control is fuzzy logic, a quasi (or is that pseudo?) AI technique that makes decisions using degrees of truth rather than strict true/false logic. I am not that familiar with it, but Bob Pease is, as some of his past columns reveal. (For example, see http://electronicdesign.com/article/articles/what-s-all-this-fuzzy-logic-stuff-anyhow-part-iv-4.aspx.)
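To give a flavor of the fuzzy-logic idea, here is a minimal sketch: inputs belong to fuzzy sets by degree (0.0 to 1.0) via membership functions, and the outputs of the rules are blended by a weighted average. The fan-speed controller, the temperature ranges, and the output speeds are all hypothetical.

```python
# Fuzzy logic sketch: inputs belong to sets by degree (0.0 to 1.0)
# rather than strictly true/false. Hypothetical fan-speed controller;
# all ranges and speeds below are made-up illustrations.

def tri(x, a, b, c):
    """Triangular membership: rises from a to a peak at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fan_speed(temp_c):
    # Degree to which the temperature is "warm" or "hot."
    warm = tri(temp_c, 15.0, 25.0, 35.0)
    hot = tri(temp_c, 25.0, 40.0, 55.0)
    # Blend each rule's output speed, weighted by how strongly it fired
    # (a simple weighted-average defuzzification).
    total = warm + hot
    if total == 0.0:
        return 0.0
    return (warm * 40.0 + hot * 100.0) / total

print(round(fan_speed(30.0), 1))  # 64.0
```

At 30°C the temperature is partly "warm" and partly "hot," so the controller settles on an intermediate speed instead of snapping between two hard thresholds, which is exactly what makes the technique attractive for smooth embedded control.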