There’s a lot more to embedded software security than simply writing ‘bomb proof’ code

Security was the buzzword of 2015, particularly when it came to the Internet of Things (IoT). The industry came to realise that billions of devices with communications capability presented something of a threat and that ‘something needed to be done’.

If you are in any doubt, consider the recent Black Hat conference in Las Vegas. The global IT security event had sessions specifically addressing how to hack IoT applications and how to hack ARM based systems. Both sessions were sold out, with attendees eager to learn the latest defence techniques.

An accusing finger has pointed towards embedded software as a potential weak point. But, according to a couple of industry specialists, it’s not the software that’s the problem; rather, it’s the people who write it. It’s time, they agreed, for a bit of education.

Colin Walls, an embedded software technologist with Mentor Graphics, said: “I’m worried about use of the term ‘security’, because it can mean one of several things; all of which are important. For example, it can mean protecting data you’re transmitting that you don’t want people to see. It can also mean preventing people getting into systems. Then there’s making systems safe.”

But Walls counselled: “Safety and security are not the same thing; safety is protecting the world, while security is vice versa.”

Niall Cooling, CEO of Feabhas Software, added: “Security isn’t just about software; it’s the system. There are a lot of parallels with safety and you can’t isolate one element. Security needs to be end to end.”

Both agree the embedded systems world has overlooked the need for software security for some time. “How many systems would have been connected only a few years ago?” Walls asked. “Probably only about 10%. Today, it’s more like 95%.” The problem, he pointed out, is often that connectivity is added to a system that hasn’t been designed with connectivity in mind. “As soon as you introduce connectivity, all aspects become a concern because people connect things without thinking about security.”

Cooling pointed out that a lot of embedded systems don’t have user names or passwords. “They tend to be closed and aren’t configurable,” he said. “If you want to configure something, you have to interact with it and that information is often sent over a network as ASCII characters. It’s one way people can get into a system and we, as an industry, will have to do more string management in the future.”

He believes poor string management has been one of the biggest sources of exploits, but noted there are other issues. “You can attack software by forcing overflows, for example. But one of the questions that needs to be asked is whether we’re using the right language. While the use of C won’t change, it’s not secure.”

Continuing, Cooling contended the software security discussion has started from the wrong point. “We’re trying to secure systems using a language that isn’t inherently secure.”

One of the reasons why C and C++ continue to prevail is cost. Walls said: “The drivers for embedded systems development have been cost, design cycle time and power consumption. We look to make the code as small as possible so it can run on the cheapest device. Making devices secure has thrown this process a ‘curve ball’ because cryptography is expensive and power hungry.”

Languages are also an issue, Cooling suggested. “C and C++ are efficient, but it all comes down to how much flash you have and which processor you use. The bill of materials plays a big part.”

It’s here the experts agree on the need for education. Walls pointed out that, in his experience, embedded engineers are conservative. “They tend not to do things differently; usually for a good reason.”

Cooling agreed: “Software is not the main background for embedded system developers. They may be domain specialists, but software is only part of their job and they do what they’re comfortable with.”

So what advice can the two experts offer? “Everything is important at the moment,” Walls observed. “Everyone is worried about IoT security – and that comes down to people making decisions they shouldn’t have. But, in the big scheme of things, there’s nothing more important than data security – connectivity has crept up on people and they connect things without thinking about security.”

Cooling expanded. “You can ride off the back of traditional communications approaches, such as TCP/IP, but there are things like IPv6 and 6LoWPAN coming through; people need to address these. Beyond that, there’s the platform. What are you programming? Then there’s hardware security, the operating system and the apps using that OS. Systems will need to support randomisation and, even though you can have well written apps, if someone can see the memory through a JTAG port, everything crumbles.”

Walls agreed. “You can’t look at security in isolation; you’re building a system. But if you do want a particular level of security, you might want to use multiple cores, which gives the opportunity to segment the applications and to put a different OS on each core. Also think about using a hypervisor as the overarching control software for multiple cores. You can also accommodate legacy systems and control the ways in which they communicate.”

“Encryption is an obvious approach,” he continued, “and if you use the latest secure versions of protocols, you should be in good shape.”

Cooling agreed that hypervisors made ‘a huge amount of sense’. “ARM’s TrustZone technology on Cortex-M based processors will be massive for the IoT. It will be central for the future and will eliminate a raft of potential problems – it’ll be a game changer.”

Beyond that, Walls pointed to mechanisms that might protect against a software overflow. “Someone trying to take advantage of an overflow would hit a buffer,” he pointed out. “And you can always build in a self test routine that can recognise if you have an overflow before you find out about it. Self testing code is something that not everyone has thought about and might only require a couple of hours’ work.”

In Cooling’s opinion, there is a need for more rigour, or what he called ‘enforced automation’. “There are basic things like static analysis tools and I’m shocked by how many companies don’t use this approach. It will solve many issues and help to fix silly mistakes. People also need to understand how to test for construction and for coverage.”

“There are a lot of parallels with the safety world, but safety has a lot of compliance requirements and standards like ISO 26262. There’s nothing like that in the commercial world. Standards make you write code properly and test it properly. They stop unwanted behaviour, and developers must learn best practice quickly.”

There is also the need to design for the future. Walls said: “Make the assumption that there are faults in the software you’ve written.” Cooling agreed: “It’s not just about security today; it’s about how you maintain security going forward. We have to build systems on the basis that they will become insecure in the future and need software upgrades,” he concluded.

Keeping on top of integer types

Integer types in C can be confusing, but the core types are:

  • char
  • short
  • int
  • long
  • long long

Each type can be unsigned or signed.

Problems with integers occur in a number of ways, including:

  • overflow
  • underflow
  • promotion/extension
  • demotion/narrowing
  • sign conversion

The most common root cause of integer-based attacks is an implementation that mixes signed and unsigned values. Targets include standard library functions, such as malloc or memcpy, which both take parameters of type size_t.

In this code example, if the attacker can craft copySize so that it is a negative number, the bounds test is false, so the check is bypassed. When the negative value reaches memcpy, it is converted to a huge unsigned size_t, and the multiplication can be made to wrap to a size of the attacker’s choosing.

int copySize;

// do work, copySize calculated…

if (copySize > MAX_BUF_SZ) {
    return -1;
}

memcpy(&d, &s, copySize * sizeof(type));

The output shows that, by crafting the value of copySize, memcpy overflows the destination buffer (d) into the following memory (buffer c).

$ ./a.out -2147482047
s[1024] 1712400 c[0] 0
About to copy 6404 bytes
s[1024] 1712400 c[0] 1712400

You can improve the quality of your software by:

  • reading more on the subject
  • using compiler flags
  • following a security based standard
  • enforcing the standard using static analysis tools