Reconfigurable Instruments Respond To Custom Demands
The world is becoming more customized. We’re no longer content with a few choices of car models, computers, or cell phones. We want personalized features that can change and adapt to our needs.
Engineers have responded by creating devices with more intelligence and software so they can be rapidly customized and reconfigured. Think about your mobile phone. It used to be a fixed-functionality device that you “updated” every couple of years. Now, it’s a handheld computer that runs a set of “apps” that you select and change as often as you like.
This endless reconfigurability has created a challenge for test engineers. New features come fast and are difficult to predict. Therefore, a test system that is too rigid rapidly becomes obsolete. To keep up with the needs of modern automated test systems, engineers are employing reconfigurable instruments—measurement devices whose functionality can be reprogrammed to redefine the instrument as needs change.
For years, instruments, particularly modular instruments, have been customized through software running on a host computer. This method is still ideal for many automated test applications in use today. The combined demand for customization and high performance, though, is pushing reconfigurability down into the hardware itself.
One example of this is testing a modern RF transceiver, where coding/decoding, modulation/demodulation, packing/unpacking, and other data-intensive tasks may need to occur in a defined time cycle to properly test the device under test (DUT). In these cases, the software-defined architecture needs to be flexible enough to incorporate user-programmable hardware to place the necessary intelligence inside the instrument. These reconfigurable instruments are based on an architecture where data can be acted upon in real time on the instrument and/or processed centrally by the host processor.
The Role Of FPGAs
FPGAs are the key enabling technology for this architecture because they combine the performance of dedicated hardware with the programmability of processor-based systems. FPGAs are silicon chips with prebuilt logic blocks and programmable routing resources that engineers can configure to implement custom hardware functionality. In addition, FPGAs are completely reconfigurable and instantly take on a new personality when a new configuration is compiled and downloaded.
Beyond being user-programmable, FPGAs offer hardware-timed execution speed as well as high determinism and reliability. Their architecture is truly parallel, so different processing operations do not have to compete for the same resources. Each independent processing task has its own dedicated section of the chip and can function autonomously. As a result, adding more processing does not affect the performance of other parts of the application.
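To make that parallelism concrete, consider the minimal sketch below, written in the plain C that many high-level synthesis flows accept. The function names, data sizes, and tasks are illustrative assumptions, not any vendor's API. Each function would synthesize to its own dedicated region of the fabric, so adding the second task consumes more chip area but steals no time from the first.

    #include <stdint.h>

    #define N 1024  /* record length (illustrative) */

    /* Task 1: track the running minimum and maximum of a record.
     * In hardware, this loop becomes one dedicated block of logic. */
    void envelope(const int16_t in[N], int16_t *min_out, int16_t *max_out) {
        int16_t lo = in[0], hi = in[0];
        for (int i = 1; i < N; i++) {
            if (in[i] < lo) lo = in[i];
            if (in[i] > hi) hi = in[i];
        }
        *min_out = lo;
        *max_out = hi;
    }

    /* Task 2: flag samples above a threshold. A second, fully
     * independent block that never shares resources with Task 1,
     * so both run concurrently at full rate. */
    void threshold_flags(const int16_t in[N], uint8_t flags[N], int16_t level) {
        for (int i = 0; i < N; i++)
            flags[i] = (uint8_t)(in[i] > level);
    }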
While FPGAs have been used inside instruments for more than a decade, test engineers have seldom been given access to embed their own algorithms on them. To be useful to test engineers, these FPGAs must be reprogrammable by the engineer from software. In other words, they should be used to push test software programmability down into the hardware itself.
In the past, FPGA technology was available only to engineers with a deep understanding of digital hardware design tools, such as hardware description languages (HDLs) like Verilog and VHDL, which use low-level syntax to describe hardware behavior. Most test engineers are not experts with these tools.
However, the rise of high-level design tools is changing the rules of FPGA programming. New technologies convert graphical block diagrams or even C code into digital hardware circuitry, and by abstracting the details of FPGA programming, these system-level tools bridge the gap.
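As a hedged illustration of what such a tool consumes, the function below is ordinary C: a 2:1 decimating averager using only fixed-point arithmetic (the function name and types are invented for this sketch). Its loop is simple and regular enough that a C-to-hardware compiler can pipeline it into streaming logic.

    #include <stdint.h>
    #include <stddef.h>

    /* 2:1 decimation with a two-sample average. Integer math only:
     * no floating point, no heap, just the kind of regular loop a
     * C-to-gates tool can turn into a hardware pipeline. */
    void decimate_by_2(const int16_t *in, size_t n, int16_t *out) {
        for (size_t i = 0; i + 1 < n; i += 2) {
            int32_t sum = (int32_t)in[i] + (int32_t)in[i + 1];
            out[i / 2] = (int16_t)(sum / 2);  /* average of the pair */
        }
    }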
Clearly, some types of processing run better on a host processor and others on an FPGA. For example, an FPGA is generally well suited to inline analysis such as digital filtering and decimation, whereas complex modulation or image processing will typically achieve better performance on a multicore processor because of the large number of floating-point calculations required.
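For contrast, here is a sketch of the floating-point-heavy work that belongs on the host: a naive single-bin DFT in double precision (the function is hypothetical and deliberately unoptimized; a production system would call an FFT library and spread records across cores).

    #include <math.h>
    #include <stddef.h>

    /* Magnitude of frequency bin k of an n-point record, computed
     * the slow, obvious way. Two transcendental calls and several
     * double-precision multiplies per sample: natural territory for
     * a multicore processor, expensive territory for an FPGA. */
    double dft_bin_magnitude(const double *x, size_t n, size_t k) {
        const double PI = 3.14159265358979323846;
        double re = 0.0, im = 0.0;
        for (size_t i = 0; i < n; i++) {
            double phase = -2.0 * PI * (double)k * (double)i / (double)n;
            re += x[i] * cos(phase);
            im += x[i] * sin(phase);
        }
        return sqrt(re * re + im * im);
    }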
The ideal solution for developing a software-defined test system is a single system design environment that provides the ability to quickly partition the processing between the host and the FPGA to see which mapping offers superior performance.
Reconfigurable instruments can also meet application challenges that are impossible to solve with traditional methods. Devices such as RFID tags, complex systems-on-a-chip (SoCs), and engine control units (ECUs) often require the test system to emulate the environment around the DUT, including complex timing relationships. This kind of timing and real-time response can be accomplished only in hardware.
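The sketch below suggests why, under the assumption that each loop iteration maps to one sample clock once synthesized to hardware (the signal names and latency value are invented for illustration). In the fabric, the four-sample response deadline is met by construction; on a host operating system, scheduling jitter makes such a guarantee impossible.

    #include <stdint.h>
    #include <stddef.h>

    #define LATENCY 4  /* required response delay, in sample clocks */

    /* Hypothetical protocol emulation: whenever the DUT asserts its
     * request line, drive the response line exactly LATENCY samples
     * later. One loop iteration = one clock cycle in hardware. */
    void emulate_response(const uint8_t req[], uint8_t resp[], size_t n) {
        for (size_t i = 0; i < n; i++)
            resp[i] = (i >= LATENCY) ? req[i - LATENCY] : 0;
    }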
For other applications, reconfigurable instruments will enable test engineers to look for creative ways to reduce test time and system cost. Take, for example, a digitizer that has an FPGA in line with its analog-to-digital converter. An engineer can deploy functions to the FPGA such as filtering, peak detection, fast Fourier transforms (FFTs), or custom triggering. An FPGA-based digitizer can quickly decide which data is worthless and can be discarded, and which data has value and should be passed on, substantially reducing measurement time.
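A minimal sketch of that inline data reduction, with invented names and a bare-bones trigger condition: the FPGA forwards a captured segment to the host only if it contains a sample above the trigger level, so worthless records never cross the bus.

    #include <stdint.h>
    #include <stddef.h>

    /* Keep a segment only if some sample reaches the trigger level.
     * Returns the number of samples forwarded (0 = discarded). */
    size_t keep_if_triggered(const int16_t *seg, size_t n,
                             int16_t trigger_level, int16_t *out) {
        int hit = 0;
        for (size_t i = 0; i < n; i++) {
            if (seg[i] >= trigger_level) { hit = 1; break; }
        }
        if (!hit)
            return 0;           /* nothing of value: drop the record */
        for (size_t i = 0; i < n; i++)
            out[i] = seg[i];    /* worth keeping: send to the host */
        return n;
    }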
The consumer’s quest for customization won’t wane. Rather, more devices will use embedded software to define, and redefine, their functionality. Fortunately, reconfigurable instruments will enable test engineers to keep up with these requirements and ensure high-quality, well-tested devices.