Terms like machine learning (ML), artificial intelligence (AI), neural networks, and deep learning abound in articles and application descriptions these days. Dreams of human-like robots and self-driving cars continue to emerge, and we’re closer to those now than ever before. However, the former is still a pipe dream, and even the latter is farther off than most people would like. Too many assume that the ability to recognize people, gestures, or sentences indicates that cognizant AI is just around the corner.
On the flip side, past and current AI/ML advances have turned out to be nothing short of astounding. Hardware improvements have been significant drivers, but software advances have been even more important. The parallel nature of many of these improvements has proven especially beneficial, because single-core processor technology has effectively hit a power barrier.
As with any new technology, lots of questions arise as one learns what technologies are now available, what they can do, how they work, and how they can be incorporated into an application. The challenge with the latest AI/ML solutions is the extremely broad range of what’s available. A chart covering this technology looks less like a tidy hierarchy and more like a banyan tree.
Still, the benefits and applications are significant. Likewise, the hardware and software that can be employed ranges from something running on a standard 8-bit microcontroller to a custom AI/ML chip running across hundreds of server racks in the cloud. Of course, that little micro might only be handling predictive maintenance for a motor controller, while a hardware-accelerated system might be analyzing multiple video streams.
Another challenging aspect of AI/ML is the technology’s perpetual, expansive growth. It’s not uncommon for a new version of a machine-learning model compiler to double the performance of an application with no change to the model.
Most applications will not benefit from AI/ML, and it’s not a great idea to force-fit a fancy new technology into an established design. On the other hand, taking a fresh look at an application with an understanding of the various AI/ML methodologies can reveal ways to provide significant enhancements.
It’s worthwhile learning more about AI/ML, if only to identify hardware and software that may be useful in developing an application. Getting to the point where the technology can actually be incorporated takes significant effort, though, including determining the advantages to be gained as well as the costs involved.