Expect broader application of software-driven hardware verification in 2016

The system and semiconductor worlds are in transition. In the past, verification focused on finding bugs; today, the electronic design industry is shifting towards greater efficiency in finding bug root causes and in fixing them. To that end, SoC providers have been looking to ship software with their products. While this added value speeds system design, it tends to increase verification complexity and moves the responsibility for hardware/software verification to the SoC provider. As a result, in 2016, we can expect to see broad application of software-driven hardware verification methodologies.

Some system companies are designing their own semiconductors – in fact, all the leading smartphone companies design their own application processors. At the same time, semiconductor companies must create a large part of the software stack for each SoC, since software and silicon are intimately related. These trends mean that software and the SoC need to be designed in parallel.

But this is not the only change we are seeing. There is more parallelism elsewhere – in the interrelation of thermal and power analysis, packaging and EMIR analysis, system architecture and test strategy, and more.

This highly concurrent design process involves what Cadence calls System Design Enablement, or SDE. Ultimately, SDE is about the convergence of the electrical, software and mechanical domains, resulting in an entire end product – the system. Taking such a holistic view accounts for the fact that software now represents the greatest cost and the biggest bottleneck in SoC design. And, because SoCs play an increasingly important role in many electronic systems, it has become vital to ensure that every part of the system, from chip to package to board, is optimised and verified.

Embedded software development and hardware/software verification must begin earlier. An SDE tool suite and flow can support the increased role of software by providing pre-silicon development platforms for hardware/software co-design and co-verification: virtual platforms, emulation, simulation and FPGA-based prototyping.

For example, an SoC intended for a smartphone has to run Android (with one obvious exception). It doesn’t matter whether the chip is designed by a smartphone company for its own use or by an SoC company selling standard products; the requirements are very similar in either case and Android simply has to run on the chip. No company designing such a chip is going to tape out the design without first running a simulation of Android on a model of the chip. This is not just to ensure the software runs – other characteristics, such as the effectiveness of the SoC’s power architecture or the thermal effects in different modes (making a call, listening to an MP3 file), also need to be measured. This is software-driven hardware verification, but it is not specific to the smartphone example. Chips for automotive, vision and many IoT devices have large software stacks that provide the most basic functions of the SoC. But before anything else can be considered, you need to ‘run the software’.

It is no surprise that system companies which do their own semiconductor design have more software engineers than semiconductor designers. But the same is true of semiconductor companies that do SoC designs.

Verification challenges

Verification always requires a multi-faceted approach. Software-driven hardware verification, in fact, can only be used relatively late in the design cycle, when enough of the design has been completed for the software to run. Earlier, at the block level, verification can be done with simulation and verification IP, with formal techniques, or even with FPGA prototyping. But the software needs to be run when the design is approaching tapeout and most of the blocks exist.

However, there is a major challenge. Booting Android, let alone running any application software once it is booted, requires billions of vectors. The SoC on which the software has to run may itself consist of billions of gates. This makes verification time-consuming and complex – but it has to be done. The cost – in terms of time and money – is far too great to risk taping out an SoC in which all blocks have been verified, but the ultimate system verification of running the software has not been performed.

There are two key technologies for software-driven hardware verification. The first is emulation. Emulators are relatively expensive, but they make software-driven hardware verification efficient and effective. Over the years, emulation has sometimes been the weak link because of a lack of flexibility and the difficulty of getting a design into the system. A decade ago, this could take many months; now, the landscape is different and emulation tools can accept anything RTL simulation can accept and compile it in a matter of hours.

The second key technology for software-driven hardware verification is virtual platform technology. This allows code written for one microprocessor to run on a core with a different architecture. The binary of the software load runs on a ‘model’ of the processor. The word ‘model’ is in quotes because, under the hood, the microprocessor instructions are compiled on the fly. This so-called just-in-time (JIT) compilation is similar to the way Java virtual machines work. In fact, compilation sometimes only takes place the second time an instruction sequence is seen: so much code during system boot is executed only once that it doesn’t justify the cost of compilation versus simply interpreting the instructions.
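To make the compile-on-second-sight idea concrete, here is a minimal sketch of a threshold-based JIT translation cache. It is purely illustrative: the toy instruction set, the block granularity and the names (interpret, compile_block, JitEngine) are assumptions made for the example, not any vendor’s implementation.

```python
from typing import Callable, Dict, List, Tuple

Instruction = Tuple[str, int]  # a toy instruction: (opcode, operand)

def interpret(block: List[Instruction], state: Dict[str, int]) -> None:
    """Slow path: execute the block one instruction at a time."""
    for op, arg in block:
        if op == "addi":
            state["acc"] += arg

def compile_block(block: List[Instruction]) -> Callable[[Dict[str, int]], None]:
    """Fast path: 'translate' the block into a single host callable.
    A real translator would emit host machine code here."""
    total = sum(arg for op, arg in block if op == "addi")
    def translated(state: Dict[str, int]) -> None:
        state["acc"] += total
    return translated

class JitEngine:
    def __init__(self) -> None:
        self.seen: Dict[int, int] = {}    # block address -> times executed
        self.cache: Dict[int, Callable[[Dict[str, int]], None]] = {}

    def execute(self, addr: int, block: List[Instruction],
                state: Dict[str, int]) -> None:
        if addr in self.cache:            # already translated: run fast path
            self.cache[addr](state)
            return
        self.seen[addr] = self.seen.get(addr, 0) + 1
        if self.seen[addr] >= 2:          # seen twice: now worth compiling
            self.cache[addr] = compile_block(block)
            self.cache[addr](state)
        else:                             # first sighting: just interpret
            interpret(block, state)

# Boot-style usage: a block is interpreted once, then compiled on re-execution.
engine, state = JitEngine(), {"acc": 0}
boot = [("addi", 1), ("addi", 2)]
engine.execute(0x100, boot, state)   # interpreted
engine.execute(0x100, boot, state)   # compiled and cached
assert state["acc"] == 6
```

Boot code that runs once never pays the compilation cost; hot loops in the application pay it once and then run at translated speed.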

Virtual platforms are used because they are much faster and simpler than running a full RTL model of the processor. Even if an RTL model of the microprocessor is available, verifying the software application may not need that level of detail, so emulation may not be the best approach, even with the enormous throughput of an emulator.

These two approaches work hand in hand in what is sometimes called hybrid verification. The code binary – Android, let’s say – runs on the virtual platform, while the rest of the design can be compiled from RTL and run on the emulation platform. The two parts are linked together automatically so that, for example, when the code for a device driver accesses its corresponding device, the RTL in the emulation platform sees the vectors.
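As a sketch of that linkage, the following shows a hypothetical address-map bridge: software accesses that fall in an emulated device’s address window cross over to the emulator side, while everything else stays on the fast virtual-platform path. The class names and the address map are invented for illustration; real hybrid flows use vendor transactors for this.

```python
from typing import Dict

class EmulatedDevice:
    """Stands in for a device whose RTL runs on the emulator."""
    def __init__(self) -> None:
        self.regs: Dict[int, int] = {}

    def write(self, offset: int, value: int) -> None:
        # In a real hybrid flow, this access becomes vectors driven
        # into the RTL on the emulation platform.
        self.regs[offset] = value

class HybridBus:
    """Routes memory accesses made by software on the virtual platform."""
    def __init__(self, device: EmulatedDevice, base: int, size: int) -> None:
        self.device, self.base, self.size = device, base, size
        self.host_mem: Dict[int, int] = {}   # ordinary memory stays local

    def store(self, addr: int, value: int) -> None:
        if self.base <= addr < self.base + self.size:
            # Device-driver access: cross to the emulator side.
            self.device.write(addr - self.base, value)
        else:
            # Ordinary memory: stay on the virtual-platform fast path.
            self.host_mem[addr] = value

# A driver write lands in the device window and crosses over automatically.
bus = HybridBus(EmulatedDevice(), base=0x40000000, size=0x1000)
bus.store(0x40000010, 0xABCD)   # forwarded to the 'emulated' RTL
bus.store(0x80000000, 42)       # handled entirely on the host side
```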

Fig 2: Verifying a design now calls for an array of technologies covering all aspects of the SoC.

More important

The software development team for an SoC faces a similar problem – checking that the software they write runs on the SoC before silicon is available. Software-based verification gives them a platform on which to test that their code runs correctly in parallel with SoC development, and potentially to co-optimise key performance bottlenecks.

Software-based verification is not new, of course, but 2016 is going to be the year in which it becomes more important. As the software component of a system grows, so too does the requirement to ensure the SoC runs pre-existing software before silicon is available.

Author profile:
Chi-Ping Hsu is chief strategy officer for EDA products and technologies at Cadence Design Systems.