I remember when FPGAs were specialized devices requiring custom programming that could only be generated by wizards. They were often hidden inside radar systems or other high-performance platforms where the cost of the devices and programming was warranted. These days, that’s all changed.
Low-cost, flash-based FPGAs are showing up in all sorts of embedded devices, and high-performance FPGAs are turning network interface cards (NICs) and solid-state drives (SSDs) into programmable computing platforms. That kind of in-peripheral processing was once the realm of ASICs, where high volumes made custom silicon practical for applications like encrypted disk drives. Making FPGA-based alternatives practical assumes standard software interfaces with sufficient adoption.
Nowadays, FPGAs are mainstream. FPGA boards are found in the cloud alongside GPGPU boards and machine-learning/artificial-intelligence (ML/AI) accelerators. FPGAs can handle ML/AI acceleration, although custom chips are often better when ML/AI is the only workload that needs optimizing. FPGAs offer a more flexible approach, allowing more than just ML/AI acceleration to be implemented in hardware. Standardizing the APIs and driver interfaces has changed the way people view FPGAs, GPGPUs, and ML/AI support in the cloud.
The same thing is happening at the peripheral level. SmartNICs with onboard FPGAs accelerate network processing and offload those chores from the host. That reduces system bandwidth requirements and helps keep data moving at wire speed, where a host processor might otherwise become overwhelmed.
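To get a feel for why a host can be overwhelmed at wire speed, consider a rough back-of-the-envelope calculation. The link rate, frame size, and clock speed below are illustrative assumptions, not figures for any particular SmartNIC:

```python
# Rough illustration of why wire-speed packet processing strains a host CPU.
# All numbers here are assumptions chosen for illustration only.

LINK_RATE_BPS = 100e9     # assume a 100-Gb/s Ethernet link
MIN_FRAME_BYTES = 64      # minimum Ethernet frame size
OVERHEAD_BYTES = 20       # preamble + inter-frame gap on the wire
CPU_HZ = 3e9              # assume a 3-GHz CPU core

bytes_on_wire = MIN_FRAME_BYTES + OVERHEAD_BYTES
packets_per_sec = LINK_RATE_BPS / (bytes_on_wire * 8)
cycles_per_packet = CPU_HZ / packets_per_sec

print(f"Worst-case packet rate: {packets_per_sec / 1e6:.1f} Mpps")
print(f"CPU cycles available per packet (one core): {cycles_per_packet:.0f}")
# Roughly 148.8 Mpps and about 20 cycles per packet -- far too few to do
# meaningful work in software, which is why pushing the processing into
# the NIC's FPGA pays off.
```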
Xilinx’s SmartSSD computational storage device (CSD) is an example of how FPGAs are playing a role in storage (see figure). It’s not the first FPGA/SSD combination on the market, but the push to use it with standards such as NVMe, which runs over PCI Express (PCIe), makes it a very interesting platform.
Like SmartNICs, SmartSSDs can utilize the FPGA to implement a variety of features that might otherwise need to be handled by a host processor. For example, a SmartSSD may be programmed to handle data compression and encryption. It could also turn the basic SSD into a content-addressable memory or even an ML/AI engine. Applications like real-time multimedia transcoding fit in both SmartNIC and SmartSSD arenas.
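As a rough sketch of why device-side processing saves bus bandwidth, the snippet below compares shipping an entire data set to the host for filtering against returning only the matching, compressed records, the way a computational storage device might. The data set, query, and use of zlib are stand-ins for illustration; a real SmartSSD would run the filter and compressor in FPGA logic, not Python:

```python
# Conceptual comparison of host-side vs. device-side filtering/compression.
# The records and query are made up; zlib stands in for an FPGA compressor.
import zlib

records = [f"sensor-{i % 50},{i},{'ALERT' if i % 997 == 0 else 'ok'}".encode()
           for i in range(100_000)]

# Host-side approach: every record crosses the PCIe bus, the host does the work.
host_bytes_moved = sum(len(r) for r in records)
host_matches = [r for r in records if b"ALERT" in r]

# Device-side approach (simulated): the drive filters and compresses,
# so only the compressed matches cross the bus.
device_payload = zlib.compress(b"\n".join(r for r in records if b"ALERT" in r))
device_bytes_moved = len(device_payload)

print(f"Matches found: {len(host_matches)}")
print(f"Bytes moved with host-side filtering:   {host_bytes_moved:,}")
print(f"Bytes moved with device-side filtering: {device_bytes_moved:,}")
```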
Moving computational chores closer to the peripherals makes sense, especially as data centers become disaggregated. It also makes sense in embedded applications, where functionality can be distributed to simplify development and improve modularity.
Keep an eye out for other peripherals, from cameras to motor control, to come with much more intelligence than in the past.