Outlook 2013: Rack and stack takes a back seat


Over the last several decades, the basic model of instrumentation has remained largely unchanged. Engineers and scientists have typically selected fixed function hardware and used software, such as NI LabVIEW, on a PC to control the instrument. The PC, and the bus connecting it to the instrument, have taken on many guises: from a desktop PC controlling instruments via LAN, serial or GPIB, to a complete modular approach (such as the open industry standard PXI platform) optimised for automated test and measurement.

But even modular instruments have much of their functionality defined by vendor-written embedded firmware, and there is no way for you to change that firmware to suit your specific application. The notion of fixed function hardware, however, is becoming outmoded. Just ask the mobile phone providers that have had to adapt rapidly to software based smartphones: customers demand the ability to tailor their phones' functionality to their individual needs using software running on the devices themselves. So why should test equipment be any different?

The next generation of instrumentation technology is making it possible for engineers to redesign the software running on the instrument itself: the firmware. This approach, known as software designed instrumentation, shifts the focus away from what the vendor thinks the instrument should be used for and places the emphasis on what the user really needs it to do.

To give an example, consider a typical instrument: the spectrum analyser. The traditional spectrum analyser uses a swept tuned approach to measurements: the local oscillator is swept through the band of interest and the resulting intermediate frequency signal passes through an analogue resolution bandwidth filter. This increases test times and reduces flexibility. Traditionally, if you wanted to scale the capability of the spectrum analyser for better standards coverage, you were dependent on the vendor: much of the instrument's functionality was effectively fixed by the vendor's choice of analogue components and signal processing, selected to accommodate the greatest number of use cases at the time the instrument was developed. This is an inflexible paradigm that is difficult to scale to emerging wireless standards. A more flexible approach is to use a digital signal analyser, which makes use of internal processing components such as FPGAs to perform RF algorithms inside the instrument's data stream.
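The contrast between the two architectures can be illustrated with a minimal host-side sketch. Rather than sweeping a local oscillator across the band, an FFT-based analyser captures one block of baseband IQ samples and computes the whole spectrum in a single pass. The function below is purely illustrative (the sample rate, block size and test tone are made-up numbers, not from the article):

```python
import numpy as np

def fft_power_spectrum(iq, fs):
    """Estimate a power spectrum from one captured block of baseband IQ
    samples, as an FFT-based analyser does, instead of sweeping a local
    oscillator through the band of interest."""
    n = len(iq)
    window = np.hanning(n)  # taper to reduce spectral leakage
    spectrum = np.fft.fftshift(np.fft.fft(iq * window))
    power_db = 20 * np.log10(np.abs(spectrum) / n + 1e-12)
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1 / fs))
    return freqs, power_db

# Illustrative capture: a tone at +1 MHz in a 10 MS/s IQ record.
fs = 10e6
t = np.arange(4096) / fs
iq = np.exp(2j * np.pi * 1e6 * t)
freqs, power_db = fft_power_spectrum(iq, fs)
peak_freq = freqs[np.argmax(power_db)]
```

Every frequency bin is measured simultaneously from the same capture, which is why this approach removes the sweep time that dominates a traditional swept tuned measurement.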
Digital up conversion, digital down conversion and fractional resampling are examples of techniques for which FPGAs have become essential to achieving cost effective test times. Because a test system's signal processing needs are likely to evolve over time with new standards, improved algorithms or changes specific to the device under test (DUT), the ability to modify the code inside the instrument's FPGA can empower a test system operator to keep pace without scrapping hardware.

Access to the FPGA is also crucial in transceiver systems where decisions must be made quickly between the receipt of information and the generation of a response, with very low latency. Testing devices such as RFID tags, for example, requires short, closed loop response times; access to FPGAs in the data stream provides test engineers with a viable way to address this.

The same is true of 'protocol aware' systems, for which asynchronous serial protocol communication is required for test. By building the specifics of the protocol into the FPGA, you can abstract the programming to a high enough level to focus on the information being transmitted or received, instead of concentrating on the low level details of the protocol communication. This makes measurement systems far more modular as protocols change or evolve.

Simply providing the ability to program the FPGA, however, is not enough for this capability to be useful. The programming language must be accessible and productive for RF test engineers, which has often been a barrier when the only options were hardware description languages such as VHDL and Verilog, typically the sole preserve of embedded designers. The LabVIEW FPGA graphical programming toolchain is crucial to bridging this gap, as it allows engineers to use the same tool to program their instrument firmware as they use to control and automate their measurement hardware.
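To make the digital down conversion step concrete, here is a minimal host-side sketch of the processing chain an instrument's FPGA typically hard-wires: a numerically controlled oscillator mixes the band of interest to baseband, a low-pass filter rejects the unwanted image, and the result is decimated. All names and numbers are illustrative, and the moving-average filter stands in for the proper FIR or CIC filter a real FPGA design would use:

```python
import numpy as np

def digital_down_convert(samples, fs, f_center, decim):
    """Digital down conversion sketch: NCO mix, low-pass filter, decimate."""
    n = np.arange(len(samples))
    # Complex NCO shifts the band centred at f_center down to 0 Hz.
    baseband = samples * np.exp(-2j * np.pi * f_center / fs * n)
    # Crude moving-average low-pass filter (illustrative only; an FPGA
    # implementation would use a properly designed FIR or CIC filter).
    kernel = np.ones(decim) / decim
    filtered = np.convolve(baseband, kernel, mode="same")
    # Reduce the sample rate now that the bandwidth has been reduced.
    return filtered[::decim]

# Made-up example: a 2 MHz tone sampled at 20 MS/s, brought to baseband.
fs = 20e6
n = np.arange(8192)
signal = np.cos(2 * np.pi * 2e6 * n / fs)
iq = digital_down_convert(signal, fs, f_center=2e6, decim=8)
```

Because the tone sits exactly at `f_center`, the down-converted output is (near enough) a DC level of 0.5, which is what makes chains like this easy to verify before committing them to FPGA fabric.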
The first example of a truly software designed instrument is National Instruments' PXIe-5644R Vector Signal Transceiver (VST), which is software centric from the ground up. The module incorporates a vector signal generator and analyser alongside high speed digital I/O, but the key component is a user programmable FPGA. This pushes software as close as possible to the point where RF signals are converted to bits, and allows users to customise the open source firmware completely to their needs.

One organisation putting the software designed approach into action is Qualcomm Atheros, which has achieved a 200x increase in test speed over its previous 'rack and stack' system. Using the VST, it can control the digital interface to the DUT alongside simultaneous vector signal generation and analysis. Although the rack and stack solution was fully automated, it could only characterise 30 to 40 channel settings at any given time, using an iterative estimation approach to determine the best possible gain settings for the device. Switching to software designed instrumentation improved characterisation time so dramatically that complete gain table sweeps can now be performed for a device in a single insertion, removing the iterative aspect. Optimum gain settings can therefore be determined empirically, enabling additional operational modes because the performance of the radios is better understood.

Qualcomm Atheros' Director of Engineering Doug Johnson stated: "Instrumentation flexibility and 'to the pin' control are critical for keeping our RF test process as efficient as possible and we're pleased with the performance gains we've seen when testing with NI's new vector signal transceiver. The NI PXIe-5644R ... has improved our test throughput significantly."

As WLAN standards continue to evolve and become more complex, they require more measurements, so total test time increases. To combat this, measurement speed must increase.
In 2007, Qualcomm Atheros began testing 802.11n enabled devices and switched to modular PXI based instrumentation, leading to an initial 10x improvement in test time. The move to software designed instruments for testing 802.11ac in 2012 resulted in a further 20x improvement.

DUT control is just one of the many applications that can be enhanced by software designed instrumentation. The low latency and inline processing offered by FPGAs make applications like real time channel emulation and power level servoing obvious candidates for software designed instruments, but the possibilities are really only limited by the user. It is similar to the advance of user empowered smartphones: we could not have imagined the extent to which a software centric architecture would transform the phone from a simple communication device into an essential item that solves a diverse range of daily problems. The beauty of user empowered instrumentation is that, for every application we can think of right now, engineers will continue to find new challenges that can be solved with this flexible approach.

How will your perceptions of instrumentation change once software designed instruments become mainstream?

National Instruments

Since 1976, National Instruments has equipped engineers and scientists with tools that accelerate productivity, innovation, and discovery. NI's graphical system design approach provides an integrated software and hardware platform that simplifies development of any system that needs measurement and control. Engineers and scientists use this platform from design to production in multiple industries, advanced research, and academia. The company's long term vision and focus on improving society through its technology has led to strong, consistent company growth and success of its customers, employees, suppliers and shareholders.