The starting point for system design is always supposed to have been that legendary ‘blank sheet of paper’. Affixed to a wall, the sheet had a line drawn down the middle, with ‘hardware’ written on one side and ‘software’ on the other.

Whether this was ever really the case is a matter of some debate, and the arrival of the reference design, for example, has removed many of the partitioning headaches that engineers once wrestled with.

But while the reference design might have moved the starting point a bit further forward, there are still some serious technical questions to be answered along the route from design to production.

Pete Lomas, director of engineering with Norcott Technologies, is no stranger to system design. Over the last 30 years or so, he has developed more than 100 electronic products, with applications ranging from hydrographic research to x-ray tomography.

He pointed to a number of emerging issues that affect how a design is put together. “One of the first things to note is that the expectations of those who use instrumentation and electronic equipment have been raised by the functionality of what they have in their homes and hands.”

And he believes the IoT has made the determination of an optimal system architecture a far more complex process. “You might have had sensors, actuator interfaces and the associated control logic in one ‘module’,” he noted. “The IoT model potentially disrupts that. Will the sensors and actuators be connected directly to the controller or will you use something like Bluetooth LE or Wi-Fi to create separate modules at the point of use?”
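One way to keep the wired-versus-wireless question open during design is to hide the sensor transport behind a common interface, so the control logic is unchanged whichever way the modules end up being connected. The sketch below illustrates the idea; the class and method names, and the dummy readings, are illustrative assumptions, not anything from the article.

```python
# Sketch: abstracting the sensor transport so the architecture decision
# (direct connection vs Bluetooth LE / Wi-Fi module) can be deferred.
from abc import ABC, abstractmethod

class SensorLink(ABC):
    """Common interface for any way of reaching a sensor."""
    @abstractmethod
    def read(self) -> float: ...

class WiredSensor(SensorLink):
    def read(self) -> float:
        return 21.5          # e.g. sample an ADC on the controller board

class BleSensor(SensorLink):
    def read(self) -> float:
        return 21.5          # e.g. read a GATT characteristic over BLE

def log_temperature(link: SensorLink) -> float:
    # Controller logic stays transport-agnostic.
    return link.read()
```

The control code above works identically with `WiredSensor` or `BleSensor`, so the partitioning choice becomes a late-binding decision rather than one baked into the architecture.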

Lomas is particularly interested in man/machine interface (MMI) issues. “This has been hugely influenced by devices like the iPhone, to the extent that many young people are surprised when equipment hasn’t got a touch interface. People now want interfaces to be intuitive; to be simple on the surface.

“But that’s a challenge. When I’m using a scope, I want to change the gain by turning a knob while still looking at the waveforms. Adjusting trigger levels today is much easier via a physical knob. With touchscreens, this could be a problem and haptic feedback may hold the solution.”

So he contended that MMI design is now a matter of managing user expectations. “Should I even bother designing in an MMI, with its inherent cost?” he asked. “Why shouldn’t I just put a light on the product saying ‘connected’? You can then have the MMI as an app on a phone or a tablet or as a web service, but that means having to support a range of devices. There are arguments for and against here.”

Communication considerations

Suggesting a link to mobile devices raises the issue of how the device being designed will communicate. “You could use Wi-Fi to connect to anything that sits on the internet and that brings interesting architectural possibilities,” he said. “If you can guarantee that your kit will always be connected to the cloud, then it raises the possibility that you can perform complex processing on a server farm somewhere. But that has to be balanced with the fact that you’ve lost autonomous operation.”

That balance requires a decision on how frequently processing needs to be carried out. “If your design is only processing data occasionally, then a server farm might be a sensible decision. If, however, data is being crunched all the time, then processing has to be local.”
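The occasional-versus-continuous processing decision can be framed as a back-of-the-envelope calculation. The function below is a minimal sketch of that reasoning; the duty-cycle threshold, reliability figure and function name are illustrative assumptions, not values from the article.

```python
# Illustrative sketch of the local-vs-cloud processing trade-off:
# occasional processing suits a server farm; continuous crunching, or any
# need for autonomous operation, keeps the processing in the box.

def processing_location(duty_cycle, link_reliability, needs_autonomy):
    """Suggest where to process the data.

    duty_cycle       -- fraction of time the device is crunching data (0..1)
    link_reliability -- probability the cloud link is up (0..1)
    needs_autonomy   -- must the device keep working offline?
    """
    if needs_autonomy or link_reliability < 0.99:
        return "local"      # can't count on the cloud being there
    if duty_cycle < 0.1:
        return "cloud"      # occasional processing: use the server farm
    return "local"          # continuous crunching: process locally

# A logger that batches readings once an hour, on a dependable link:
print(processing_location(duty_cycle=0.01, link_reliability=0.999,
                          needs_autonomy=False))   # -> cloud
```

The point is not the specific thresholds but that the decision hinges on two measurable quantities, processing duty cycle and link dependability, plus the hard requirement of autonomy.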

Deciding to take that jump to processing on the cloud has a silver lining, Lomas said. “If you can migrate your business model to a service model, you can use significantly less hardware and software in the box, while new features and facilities are easier to roll out.”

But looming over decisions such as these is the ‘dark cloud of security’. Lomas noted: “Unless you have paid significant attention to how your internet communications are going to be secured, then you are at risk of being hacked; and that has financial implications for everyone. It’s a problem that I think will get worse before it gets better.”

Why care about security?

Underlining the importance of security, Lomas said there was a temptation to cut corners. “Who cares about internet security on a connected fridge?” he asked. “You have to, because someone could call a manufacturer and say they’re going to ‘kill’ every model it’s made unless they pay up.

“The internet scales risk. Security has always been a cost/reward model in that the cost of acquiring the asset (or reward) has to be matched to its value. Manufacturers now have to think about everything they have made and consider that as a single unit.”

What about maintenance or upgrades? “It should be easier for a cloud-based product,” Lomas said, “but what if that product needs to be certified: a bench top instrument, for example? If the manufacturer does a software upgrade, does it then need to be recertified? And if it’s working and not giving problems, why would I want to upgrade? Would I be happier with the faults and quirks I know about?

“There is a benefit to stability and, in particular, not having to fix something every day.”

The IoT will require a layered approach to safety, security and functionality – and these will be local. Maintenance, diagnostics and data could then be on the cloud. “That’s interesting, because it could simplify things,” Lomas contended. “However, we need to find our way through these issues and come up with innovative products.”

Blurring the boundaries

All of this requires a careful choice of processor and operating system.

“There is a whole range of things to think about in terms of chips, circuits and architecture,” he noted.

Map out carefully what the product is intended to achieve, Lomas advised. How does data need to be transformed, for example, and how can that be mapped to hardware and software cost effectively?

“In earlier times,” he recalled, “there was little or no choice; you either did it in hardware or software on a local processor. Today, devices such as integrated processors, DSPs and FPGAs blur that boundary. In many cases, it is cheaper to handle I/O using an additional processor than custom hardware.

“Also you get to corner cases – for example, where there is a lot of data or high data rates or computational intensity. Then you need to choose your architecture more carefully to achieve that hard/soft balance.”
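A quick feasibility check helps identify those corner cases: given a sensor data rate and an estimate of the processing needed per sample, would a software-only solution keep up, or is dedicated hardware such as an FPGA or DSP worth considering? The sketch below is a rough sizing calculation; all of the figures and names are illustrative assumptions.

```python
# Back-of-the-envelope check for the hard/soft balance: does the
# computational demand fit within the CPU time we are willing to spend?

def software_headroom(sample_rate_hz, cycles_per_sample, cpu_hz,
                      load_budget=0.5):
    """Return the fraction of the allowed CPU budget the task would use.

    Values below 1.0 suggest software on the existing processor is
    plausible; values well above 1.0 point towards an FPGA or DSP.
    """
    demand = sample_rate_hz * cycles_per_sample   # cycles/second needed
    return demand / (cpu_hz * load_budget)

# A 1 MS/s stream needing ~200 cycles of filtering per sample, on a
# 100 MHz microcontroller with half the CPU allocated to the task:
print(software_headroom(1e6, 200, 100e6))   # -> 4.0, i.e. 4x over budget
```

At four times the budget, this is exactly the kind of corner case Lomas describes, where the architecture needs choosing more carefully than “just add a processor”.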

Further considerations come from asking how many you will make and what the market is. “For low volumes, you’ll probably select known and trusted processing blocks for their low development cost and risk. If you’re trying to shave the last cent from the BoM,” he said, “then it’s time to invest in production engineering.”

Looking back, Lomas said system architecture has moved from a world in which designing the electronics was the whole problem to one in which it is only a small part. “Nowadays, the software infrastructure is the biggest issue and the hardest to control. But good engineering at all levels will always pay off in the long run,” he concluded.