New Electronics roundtable: The issues involved in embedded software development

Developing software for embedded systems isn't getting any easier. As systems offer more functions, code size grows and checking that code becomes a challenge. Software is also being developed for multiple processor platforms and for different operating systems. Meanwhile, security is growing in importance, particularly as more systems include wireless connectivity.

New Electronics convened a roundtable to explore some of the trends in the embedded software market. It could be argued that projects start with the operating system: a recent New Electronics survey showed 43% of respondents use a proprietary OS.

"What happens when they spec an OS?" asked Colin Walls, a technical marketer with Mentor Graphics. "There are three options: commercial; open source; or proprietary. Proprietary is surprisingly common and there are many reasons why people take that route. But it's not wise."

Having considered whether the application is real time or not – 'it's a fundamental question', said Walls – it's time to think about memory footprint. "A lot of people are concerned about this. Once the board's done, you're stuck with a given amount of memory and there are many situations where you need to keep the footprint down – you might want to put the kernel on chip or lock it into cache."

There are also commercial considerations. "Cost is always an issue," he said. "But managers often confuse cost with price. Looking at the price tag isn't always the best approach; you need to think about the cost of ownership." What does this mean? "It's like buying a car; you also need to factor in servicing and running costs." There's also the cost of experience. "People may have to learn to program a new OS – that's expensive – and it's the same with legacy code. This is likely to be locked to an OS to some extent and moving that is another expense."

The New Electronics survey showed 28% of respondents had used Linux. "People might think it's free, but it's not by any stretch. Acquiring the code might be free; deploying it isn't," Walls claimed. There are, said Walls, a number of technical reasons why Linux isn't the best embedded OS. "Memory is one. Linux has a large footprint and it's not scalable with optional components – it's all or nothing." In some systems, this won't matter, he continued.
"But with an MCU, you want memory size to be sensible, and Linux will need a memory management unit, which is an overhead you don't want in smaller systems."

Having sorted the OS, engineers need to write code. We asked what standards they wrote code to. The result was surprising: 60% of respondents didn't write to a standard. Colin Downey, managing director of tools distributor Reflex Technology, was not surprised. "Most people want to get on with writing code, but they aren't interested in checking it. But there are tools available and they free the engineer to do a proper job."

One of the 'proper jobs' is checking code quality, but engineers should also look at coverage. "It's often a surprise to engineers that finding bugs saves money. If a product gets to market and you find problems, it's expensive and it affects the company's reputation. Testing code while it's being written is the way to go."

But what standards should you test against? "IEC 61508 has been around for a long time," said Downey, "and there are many standards based on it. One of the things I like about 61508 is that it says you can't have zero risk. Instead, it uses the concept of ALARP – as low as reasonably practicable."

Downey pointed to four areas where software quality can be improved. "Source code review, static code analysis, unit test and coverage," he said. "Source code reviews are one of the most useful and low cost ways to get good code. What's shocking is that most companies don't do this – even two people can have a peer review and it's amazing what you learn.

"Static code analysis pushes the code to meet certain standards. The most common is MISRA, which has about 140 rules that have to be met in order to avoid overly complex code.

"There are now tools which will parse code and produce tests. Before, developers would write code, then pass it to the test department, who would write tests.
But it's hard to write tests for code you haven't seen before, and by then it's too late, because the problems are already there.

"Meanwhile, there is a misconception that code coverage is a test – it isn't; it's a way of checking that tests are running as they should. Code coverage tools tell you whether the tests have fully exercised the code."

Alex Wilson, senior aerospace and defence business development manager with Wind River, pointed to the challenges in developing software for modern aircraft. "The systems are so complex that standards don't cope with testing. While the system might be safe to DO-178C, there is no mechanism to see how it works with other systems; you can find multiple system failures cascading. It's only now that the sector is thinking about how to certify complex systems."

Wilson noted that aerospace has more standards, so companies have to use them. "But if you look at the military, you see commercial standards being adopted and code tested more strictly. The medical and industrial sectors are well behind."

Walls pointed out that MISRA is being repositioned to appear less motor industry specific. "What it is doing is good for other industries," he claimed, "and it would be broadminded for companies to adopt it."

Wilson is particularly interested in connectivity issues. "Systems we think of as cloud based are being connected. For that to happen, we have to follow open standards, otherwise interoperability will not be guaranteed." He said that, while standards such as DO-178C had clarified some issues, there isn't an equivalent standard for security. "The Common Criteria classification is vaguely IT based and it's only recognised in 26 countries," he noted. "How do you produce a device that is secure and which can be proved to be secure?"

The issue with security in embedded devices, he said, is proving that the code running on the system is the code which was written for it. "How do you know the person sending you data is who they say they are?" he wondered.
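Wilson's question about knowing who sent you data is, in practice, answered with message authentication codes. The sketch below – not from any of the panellists – illustrates the idea only: the hypothetical `toy_tag` function stands in for a real keyed hash such as HMAC-SHA-256 and must not be used for actual security, while the constant-time comparison shows why received tags are never checked with a plain `memcmp`.

```c
#include <stddef.h>
#include <stdint.h>

/* Toy message-authentication sketch. toy_tag() is an illustrative
 * stand-in for a real HMAC (e.g. HMAC-SHA-256): the receiver
 * recomputes the tag over the message with the shared key and
 * compares it against the tag that arrived with the message. */

#define TAG_LEN 8u

static void toy_tag(const uint8_t *key, size_t key_len,
                    const uint8_t *msg, size_t msg_len,
                    uint8_t tag[TAG_LEN])
{
    size_t i;
    for (i = 0u; i < TAG_LEN; i++) {
        tag[i] = (uint8_t)(i * 31u);            /* fixed seed */
    }
    for (i = 0u; i < key_len; i++) {            /* mix in the key */
        tag[i % TAG_LEN] = (uint8_t)(tag[i % TAG_LEN] * 131u + key[i]);
    }
    for (i = 0u; i < msg_len; i++) {            /* mix in the message */
        tag[i % TAG_LEN] = (uint8_t)(tag[i % TAG_LEN] * 131u + msg[i]);
    }
}

/* Constant-time comparison: run time does not depend on where the
 * tags first differ, which blocks simple timing attacks. */
static int tags_equal(const uint8_t *a, const uint8_t *b)
{
    uint8_t diff = 0u;
    size_t i;
    for (i = 0u; i < TAG_LEN; i++) {
        diff |= (uint8_t)(a[i] ^ b[i]);
    }
    return diff == 0u;
}

/* Returns 1 if the message carries a valid tag under this key. */
static int message_is_authentic(const uint8_t *key, size_t key_len,
                                const uint8_t *msg, size_t msg_len,
                                const uint8_t received_tag[TAG_LEN])
{
    uint8_t expected[TAG_LEN];
    toy_tag(key, key_len, msg, msg_len, expected);
    return tags_equal(expected, received_tag);
}
```

A genuine tag verifies; a message altered in transit, or tagged under a different key, does not – which is exactly the "who sent this?" question the sender and receiver's shared key answers.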
Designers have to ask how to maintain security. "Where is the gateway? Where is the physical security? With the cloud, you don't know where the functionality is – where is data stored, for example?" The last point is important because different countries have different data security regulations. "You need to know if someone might be messing with your data," he noted.

Addressing the current interest in M2M and the 'internet of things', Wilson said: "If you're building an M2M system, you need to think about the whole architecture."

Wilson highlighted USB ports as a major problem. "Go back before Stuxnet and people thought USB was not a problem. Stuxnet proved systems aren't secure and now people are more reluctant to connect, because they open their systems up to problems," he concluded.
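Wilson's earlier point – that embedded security starts with proving the code running on a system is the code that was written for it – is commonly addressed with a boot-time integrity check: the loader hashes the firmware image and compares the result against a known-good digest before jumping to it. The sketch below is illustrative only; the 32-bit FNV-1a hash is a stand-in, since a real secure-boot chain would verify a signed cryptographic digest (e.g. SHA-256) instead.

```c
#include <stddef.h>
#include <stdint.h>

/* Boot-time integrity check sketch. fnv1a32() is an illustrative
 * stand-in for a cryptographic hash; a real secure-boot chain would
 * verify a signed SHA-256 digest rather than a bare checksum. */

static uint32_t fnv1a32(const uint8_t *data, size_t len)
{
    uint32_t h = 2166136261u;       /* FNV-1a offset basis */
    size_t i;
    for (i = 0u; i < len; i++) {
        h ^= data[i];
        h *= 16777619u;             /* FNV prime */
    }
    return h;
}

/* Returns 1 if the image matches the digest recorded at build time,
 * i.e. the code in flash has not been altered. */
static int image_is_trusted(const uint8_t *image, size_t len,
                            uint32_t expected_digest)
{
    return fnv1a32(image, len) == expected_digest;
}
```

A loader would call `image_is_trusted()` before transferring control, and on failure stay in a recovery mode rather than execute unverified code – one answer to Wilson's question of how to prove a device is running what it should be.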