Electronic products, particularly in high tech sectors, have become more software intensive over the past five years*, largely through the need to increase product functionality.
For the most part, software is regarded as the means to develop smarter, more innovative and better differentiated products. However, the inherent increase in product and product development complexity that accompanies this trend is likely to impact system design strategy.

Cambridge Design Partnership (CDP) has 15 years of multidisciplinary project design experience in high tech markets, including the industrial, consumer, healthcare and cleantech sectors. James Baker, senior engineer, attributes the shift towards software to the growing importance of user interfaces in electronic products. "We realised some while ago that end users paid less attention than before to hardware content. Today, it's all about features." He cited touchscreens, voice and gesture recognition. "For many products, the user interface is the product."

CDP is committed to a systems approach to electronics development. "Hardware and software development are inseparable," Baker claimed. CDP adds user interface specialists to the mix. "Each decision by each discipline can have a major impact on another," he added. He uses the example of touchscreens, which have advanced tremendously in the last five years. "Giving the user the capability to 'throw graphics around' has increased the load on the hardware significantly," he said. "Now, we need better embedded hardware graphics accelerators."

While Baker agrees that many applications have become more software intensive, he often sees a corresponding impact on hardware. For example, processors have been falling in price dramatically, while performance increases. "A $5 processor from five years ago will now cost around 50c. You can put a ridiculous amount of processing power into an application that probably doesn't need it. Performance is no longer driving cost," he said. This is leading to more widespread use of standard processors and peripherals, which can be a lot cheaper than using dedicated hardware.
Development boards and reference designs are ideal for many applications, Baker believes. A similar trend is evident in what Baker calls firmware based design, with huge amounts of logic available in a standard FPGA. "For just a few dollars, designers can obtain a reasonable level of performance, with higher integration and fewer parts than has been achievable before." With the hardware 'sorted', it is easy to see why the emphasis is now on software.

Distributed display architecture
A current CDP project provides an example of its system design philosophy. The project – a large LED based billboard display for commercial video advertising – is described by Baker as 'groundbreaking'. "There were several very different approaches we could have taken for the system architecture. The most critical aspect is to ensure the basic requirements are correct and agreed with the customer, right at the start," Baker stated. "It sounds obvious, and it might be a very simple specification, but it is fundamental."

In this case, the product can be represented by a simple black box with a defined input (streamed video data) and output (a video stream that maps to a large display). The requirement for a scalable, cost effective design had a major influence on the system architecture. "The sheer volume of data involved in this application encouraged us to implement a strategy that limited the data bandwidth at any point in the system," Baker said. The result is a distributed system, with a network of stacked blocks of repetitive processing elements, which function in parallel to create a display of the required physical size. "Each block is small, robust and considerably lower cost than any alternative approach," Baker explained. It also ensures the design is scalable – an important factor for future developments.

Figure 1 shows the layered structure of the processing tasks required. Data is typically captured from an internet source and stored on the network.
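The tiling idea behind this kind of distributed architecture can be sketched in a few lines of Python. This is a hypothetical illustration only – the dimensions, node count, frame rate and function names are assumptions, not CDP's design – but it shows how dividing the display into per-node tiles caps the video bandwidth any single element must handle.

```python
# Hypothetical sketch: divide a large display into equal tiles, one per
# processing node, so no single link carries the full video bandwidth.

def plan_tiles(display_w, display_h, tile_w, tile_h):
    """Return (x, y, w, h) regions, one per processing node."""
    if display_w % tile_w or display_h % tile_h:
        raise ValueError("display must divide evenly into tiles")
    return [
        (x, y, tile_w, tile_h)
        for y in range(0, display_h, tile_h)
        for x in range(0, display_w, tile_w)
    ]

def per_node_bandwidth(tile_w, tile_h, fps=30, bytes_per_pixel=3):
    """Raw uncompressed video bandwidth each node handles (bytes/second)."""
    return tile_w * tile_h * bytes_per_pixel * fps

tiles = plan_tiles(3840, 2160, 960, 540)   # 16 nodes for a 4K-sized wall
print(len(tiles))                          # number of processing nodes
print(per_node_bandwidth(960, 540))        # bytes/s per node, not per system
```

Adding rows or columns of tiles scales the display without raising the load on any individual node, which is the scalability property Baker describes.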
Each node on the network performs its own processing before video is output to its designated display area. There are two distinct phases of tasks: first, the system needs to capture, store and process data; then it streams the data to the display. Analysis showed that, to meet the performance targets, much of the image processing had to be done in hardware. Not only the hardware/software partitioning, but also the choice of hardware and software for each phase, was influenced largely by the nature of the application.

The team benchmarked how long it took to process some of the images the system would have to handle. Clearly, high performance processors were required for the complex filtering tasks in the early stages of image processing. "We are using standard processing hardware running standard OS based software," Baker confirmed. The second phase is the real time element, which has to use fully clocked, deterministic logic. Baker explained that a standard processor architecture wouldn't be suitable. "Even though the processing is fast, we wouldn't know precisely when each task would finish."

Although the partitioning appeared to be obvious, it required careful consideration. "You need to look at the positives and negatives of each option and plan the critical components of each before you decide," he advised. In this case, the technology used to implement the network nodes is deliberately not leading edge. "We can use low cost, robust, readily available and easily replaceable devices," Baker confirmed. "The innovation is in the system design."

If cost had been an overriding constraint, CDP may have considered a fully custom design, especially if the end product was likely to go into high volume production. Typically, tight timescales might dictate a reuse policy. In Baker's view, design reuse is essential, as it can help to meet time and cost constraints. "There are a lot of reusable software and hardware blocks out there."
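The benchmarking step that informed the hardware/software partitioning can be illustrated with a simple timing harness. The box-blur filter and image sizes below are stand-ins for CDP's actual workload; only the principle – time a representative filter on representative data before deciding what must move into hardware – comes from the article.

```python
import time

def box_blur(img):
    """3x3 box blur on a grayscale image given as a list of rows."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = count = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total // count
    return out

def benchmark(filter_fn, img, runs=3):
    """Return the best wall-clock time (seconds) over several runs."""
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        filter_fn(img)
        best = min(best, time.perf_counter() - start)
    return best

img = [[(x * y) % 256 for x in range(64)] for y in range(64)]
print(f"3x3 blur on 64x64: {benchmark(box_blur, img):.4f} s")
```

If the measured time per frame exceeds the frame budget on the target processor, that filter becomes a candidate for hardware acceleration – the kind of evidence-based split the team describes.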
CDP brought in third party IP for the display project, including a JPEG decoder which has been integrated into an FPGA used in the second processing stage. "It was a good price and ready to go," he confirmed. Typically, reusable hardware might include IP blocks, ASICs or even boards. Splitting the functionality across two distinct hardware platforms has required additional software development effort, particularly for the real time elements.

Baker is impressed with the increasing amount of readily available, good quality open source code – CDP is a fan of embedded Linux. "This sector has changed radically," Baker reflected. "For the equivalent of one man day of effort – a huge reduction on what we would have needed even a short while ago – we acquired the capability we needed for this application." Once the 'snapshot' of the selected version of Linux has been captured, further effort is needed to maintain the code through the product's lifetime. "And a different level of expertise is required to do that today," he added.

It might seem that increasing software content means increased product complexity. But by looking critically at the basic requirements, and by ensuring a fully integrated hardware/software development team considers the options and the impact of all hardware/software choices, product quality, timescale and performance targets can be met effectively.

*Tech-Clarity, Developing Software Intensive Products, 2012.