OUTLOOK 2017: A solutions based industry

The challenges facing design engineers aren’t getting any easier, but the industry is working hard to help solve them

As cloud computing and the Internet of Things become part of everyday life, the data centre has, seemingly, become the engine of modern life. Whether it’s hosting the billions of videos uploaded onto YouTube, processing the billions of updates from Facebook users or ordering items from websites, data centres are ‘front and centre’ in function, if not visibility.

The server farms operated by Google, Facebook and the like are enormous undertakings. Between them all, millions of servers whirr away in remote locations, consuming vast amounts of data. Moving all that data from server to user is becoming more of a challenge as communications networks begin to creak under the pressure.

As Paul Pickle, president and COO of Microsemi, notes in his article in Outlook 2017, we are entering the Zettabyte Era. A zettabyte is, of course, 10²¹ bytes and it is expected that, by the end of this decade, more than 2Zbyte of data will pass across the world’s communications networks every year.
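
To put that forecast into perspective, a quick back-of-the-envelope calculation (a rough sketch only, assuming a non-leap year and traffic spread evenly around the clock) converts 2Zbyte per year into a sustained average data rate:

```python
# Back-of-the-envelope: what does 2Zbyte per year mean as a sustained data rate?
ZETTABYTE = 10**21                       # bytes
annual_traffic = 2 * ZETTABYTE           # the forecast cited above, in bytes per year
seconds_per_year = 365 * 24 * 3600

average_bits_per_second = annual_traffic * 8 / seconds_per_year
print(f"{average_bits_per_second / 10**12:,.0f} Tbit/s, averaged around the clock")
# roughly 507 Tbit/s worldwide - a figure that helps explain the pressure on network capacity
```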

It’s 50 years since Charles Kao, the so called ‘father of fibre optics’, and George Hockham proposed the use of glass fibres for communication. Kao’s work on developing fibre optic communications technology won him a share of the 2009 Nobel Prize in Physics.

In the early days of fibre optics, the premise was that traffic could grow endlessly because the capacity was, effectively, infinite. But traffic is now growing more quickly than capacity and we are heading for a so called ‘capacity crunch’.

What’s to be done? One solution is to put more fibre in place, but that costs money and those in the UK’s rural communities will be well aware of how long it takes to bring them high speed data communications. A more likely solution is to enable fibres to carry more data.

Where researchers once thought it might be possible to transmit data over fibre at 1Gbit/s, they are now pushing data rates beyond 1Tbit/s under lab conditions.

How is all this data being stored? Hard disk performance isn’t keeping up with demand and developers are now turning to flash based solid state disks to pick up the pace. Apart from performing better, flash based storage has lower latency, lower power consumption, better density and, importantly to those storing huge amounts of data, a lower cost of ownership. It won’t be long, apparently, before the first all flash data centre storage system is seen.

What about flash memory technology? Until recently, memory developers focused on shrinking cell size in order to store more data in the same area of silicon. But flash memory faces significant challenges as process technologies shrink. The result is that companies like Toshiba have now started to focus on building upwards. Prototypes of this approach were demonstrated as early as 2007, but production versions have only recently become available. Apart from storing more data per unit area, the devices can also get information in and out more quickly using the Universal Flash Storage standard.

Now, Toshiba is planning to make flash memories with 64 layers available in 2017 and says these parts will meet the needs of a wide range of applications. On the road map is a 512Gbit part and Toshiba’s ambition is to make devices with more than 100 layers.

Another kind of communication is undergoing something of a revolution. Introduced in the mid 1990s, the USB port is ubiquitous. But while its original specification suited the needs of the day, today’s requirements are more demanding. Step forward USB Type-C, described by Intersil in this publication as a ‘perfect example of taking a good technology and making it even more useful’.

Backward compatible through auto negotiation, USB Type-C connections can support more than 10Gbit/s as well as handling up to 100W – at up to 20V – bidirectionally.
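
The 20V figure matters because it keeps the current through the cable manageable; a short piece of arithmetic (illustrative only) shows why delivering 100W at the connector’s higher voltage is practical where it wouldn’t be at the traditional 5V:

```python
# Why USB Type-C power delivery steps the voltage up to 20V:
# current = power / voltage, and connector pins and cables have current limits.
target_power_w = 100.0

for voltage_v in (5.0, 20.0):
    current_a = target_power_w / voltage_v
    print(f"{target_power_w:.0f}W at {voltage_v:.0f}V needs {current_a:.0f}A")

# 100W at 5V would need 20A - far too much for a slim connector
# 100W at 20V needs only 5A - the level the Type-C specification provides for
```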

Already a talking point, the autonomous car is set to become a regular sight on our streets in the next few years; Volvo and Uber have agreed to develop a fully autonomous car by 2021. Not surprisingly, the autonomous car will be a complex system, perhaps with more lines of code than a commercial aeroplane – at least in the opinion of Cadence. Designing these systems – integrating high levels of computing performance, high bandwidth networks, high definition displays, vision systems and sensor clusters – will need tools that can treat the car as a ‘system of systems’. And the systems themselves will integrate new approaches to deal with the complexity; according to Cadence, vision systems will need to adopt deep learning technologies, including convolutional neural networks, to better identify objects such as road signs.
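
To give a flavour of what such a vision system involves, here is a minimal convolutional network sketch in Python using the open source PyTorch library; the layer sizes, the 32 x 32 input and the ten sign classes are assumptions made purely for illustration, not anything drawn from Cadence’s tools:

```python
import torch
import torch.nn as nn

class TinyRoadSignNet(nn.Module):
    """Minimal convolutional classifier sketch: a 32x32 RGB crop -> a sign class score."""
    def __init__(self, num_classes: int = 10):   # ten classes is an arbitrary example
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # learn local edge/colour features
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1),   # combine them into shape features
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

# One 32x32 camera crop, batch of one
scores = TinyRoadSignNet()(torch.randn(1, 3, 32, 32))
print(scores.shape)   # torch.Size([1, 10]) - one score per candidate sign class
```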

These design tools will also bring another benefit; the ability to test the systems before they’ve been built and to make sure they interoperate as intended.

Developments will also be seen in the embedded world as engineers take advantage of multicore devices, whether multicore microcontrollers or multicore SoCs. But how will software be allocated to the cores? That’s a question developers have been facing for the last few years: is the system static, in that one particular task always runs on one core, or is it dynamic, where tasks run on a ‘first come, first served’ basis? One prerequisite is that everything runs in a way which guarantees safe behaviour.
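
The distinction can be sketched in a few lines of Python; the ‘cores’ here are just threads and the task names are invented, so this illustrates the allocation policy rather than real multicore scheduling:

```python
from concurrent.futures import ThreadPoolExecutor
from queue import Queue, Empty

CORES = 4
tasks = [f"task_{i}" for i in range(8)]              # invented task names

# Static allocation: the mapping from task to core is fixed at design time.
static_map = {task: i % CORES for i, task in enumerate(tasks)}
print(static_map)                                    # never changes from run to run

# Dynamic allocation: cores pull work from a shared queue, first come, first served.
work = Queue()
for task in tasks:
    work.put(task)

def core_worker(core_id):
    done = []
    while True:
        try:
            done.append(work.get_nowait())           # whichever core is free takes the next task
        except Empty:
            return done

with ThreadPoolExecutor(max_workers=CORES) as pool:
    print(list(pool.map(core_worker, range(CORES)))) # varies with timing from run to run
```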

Analogue technology continues to develop, with digital assistance making some tasks easier. Designers are looking for smaller devices which consume less power, are more accurate and faster, all at lower cost. But there is a trade off between these parameters. Interestingly, device size is close to the top of the list. While this has long been the case in consumer electronics applications, industrial designers are now also being pushed to develop smaller products. The solutions come through a blend of integration, process technology and packaging innovation.

Analogue technology isn’t immune from the influence of the Internet of Things. Edge processing is likely to be an important feature of the IoT as sensors gather, then process, data. Analogue signal processing could be an important element and more capable analogue devices could take some of the load from the host processor. And those sensors will need more accuracy and lower power consumption.
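
As a simple illustration of the idea, the sketch below (with invented threshold and temperature figures) shows a sensor node that only forwards a reading to the host when it differs meaningfully from the last value reported, keeping the rest of the processing at the edge:

```python
def edge_filter(samples, threshold=0.5):
    """Forward a reading only when it differs from the last reported value
    by more than `threshold` - everything else is handled at the edge."""
    reported = []
    last = None
    for value in samples:
        if last is None or abs(value - last) > threshold:
            reported.append(value)   # this is all the host processor ever sees
            last = value
    return reported

raw = [20.0, 20.1, 20.05, 21.2, 21.3, 25.0, 25.1, 25.05]   # e.g. temperature readings
print(edge_filter(raw))   # [20.0, 21.2, 25.0] - eight samples reduced to three for the host
```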

Test and measurement companies are also responding to design challenges, looking to provide devices with more bandwidth, more functionality and new interfaces. ‘The goal is to deliver as much performance and functionality as possible in a compact package’, said one contributor.

One thing applies to all the technologies described here – knowledge. How does the busy design engineer keep up to date with modern developments? In friendlier times, engineers could read technical journals, such as New Electronics, or pick the brains of field applications engineers. Today, the internet provides a huge amount of information, but not always in one convenient place.

Could the distribution sector be poised to take on the role of information provider? High service specialist Mouser believes so. Distributors, it contends, can provide the knowledge, the learning and even the inspiration that will help engineers to identify best approaches, best technologies and best components for their projects. Alongside knowledge, companies such as Mouser are providing design tools that link to their databases, addressing Bill of Materials and purchasing issues.

As we ease towards the third decade of the Millennium, the challenges faced by electronic design engineers aren’t becoming any easier, but the industry is working to help solve them through innovative products, services and technologies.