Tech predictions from Arm for 2021


Arm experts offer their perspectives on the technologies and engineering concepts they believe will play an outsized role next year and into the coming decades.

Invisible AI

Artificial intelligence (AI) and machine learning (ML) gain ground when their complexity gets pushed into the background. More than 1.5 billion people benefit from ML algorithms when they take smartphone pictures (or later search for them in their ever-expanding photo libraries), generally without knowing it. The same phenomenon occurs whenever someone issues a command to one of the estimated 353 million smart speakers deployed worldwide. Invisibility works, according to Jem Davies, Vice President, Arm Fellow and GM of Arm’s Machine Learning Group.

Expect that invisibility to spur the adoption of many more applications. For many people, one-click smart parking will likely be their first experience with autonomous cars. Security systems that can accurately differentiate between the sound of a nearby prowler and a wandering raccoon will attract consumers.

Invisibility, however, remains hard work. Improvements in CPUs, NPUs and GPUs will be required. AI processing will also have to shift to devices to save energy and cost, putting an emphasis on creative, elegant algorithms that minimize everything: storage, bandwidth, compute and power. We will also have to give consumers a much-needed sense of privacy and data autonomy: if we don’t give individuals a better way to control how AI impacts their lives, it could become the biggest roadblock of all.
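One common way to hit those minimization targets - not a technique named in the article, just an illustrative sketch - is post-training quantization: storing a model’s weights as 8-bit integers instead of 32-bit floats cuts storage and memory bandwidth by roughly 4x, at the cost of a small, bounded approximation error. A minimal NumPy sketch with made-up weights:

```python
import numpy as np

# Hypothetical float32 weights from one layer of a small on-device model.
rng = np.random.default_rng(0)
weights = rng.standard_normal((256, 256)).astype(np.float32)

# Simple symmetric post-training quantization to int8:
# one scale factor for the whole tensor.
scale = np.max(np.abs(weights)) / 127.0
q_weights = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)

# Dequantize for use at inference time (or run int8 kernels directly).
deq = q_weights.astype(np.float32) * scale

print(f"storage: {weights.nbytes} B -> {q_weights.nbytes} B "
      f"({weights.nbytes / q_weights.nbytes:.0f}x smaller)")
print(f"max abs error: {np.max(np.abs(weights - deq)):.4f}")
```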

Memory-centric computing architectures

“The basic rationale for the many flavors of memory-centric compute - integrating CPUs and GPUs directly into memory devices, high-speed interconnects inside 3D chiplets, etc. - is that moving data takes as much or more energy than computing with it,” says Arm Fellow Rob Aitken. “If you can keep the data where it is and move the compute to the memory, the energy cost goes down.” One study of consumer device workloads found that 62.7 percent of total system energy is spent moving data between main memory and compute.
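To put rough numbers on that claim, consider the order-of-magnitude energy figures often cited in the computer-architecture literature (illustrative values, not Aitken’s; they vary with process node): an off-chip DRAM access costs hundreds of picojoules, while the arithmetic it feeds costs a few picojoules at most.

```python
# Back-of-the-envelope energy comparison, using illustrative per-operation
# figures commonly quoted in architecture surveys (actual values vary by node).
ENERGY_PJ = {
    "32-bit integer add":       0.1,
    "32-bit float multiply":    3.7,
    "32-bit SRAM (cache) read": 5.0,
    "32-bit DRAM access":     640.0,
}

baseline = ENERGY_PJ["32-bit float multiply"]
for op, pj in ENERGY_PJ.items():
    print(f"{op:26s} {pj:7.1f} pJ  ({pj / baseline:6.1f}x a float multiply)")

# Fetching one operand from DRAM costs roughly 170x the multiply that uses it,
# which is why keeping data near (or inside) the memory pays off.
```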

Other critical benefits flow from this shift in thinking: reducing the need for internal bandwidth, getting around the limited ‘beachfront’ real estate on chip edges for connections, and freeing energy ordinarily spent on data transport for other purposes.

“The ideas range from adding processors to memory cards to building customized memory instances with compute capability,” he adds. “We don’t know which will succeed, but as a concept, memory-centric computing is inevitable.”

Low-power to no-power devices

“For the Internet of Things (IoT) to achieve its full promise to society and become invisible, pervasive, and sustainable, it needs to evolve beyond batteries,” says Arm Distinguished Engineer James Myers.

“Whether that is with flexible solar cells, RF power delivery or sci-fi biobatteries powered by sweat or algae, we’ll make this happen over the next decade.”

One estimate puts the number of batteries we’d need in a world of a trillion devices at 913 million a day. As the IoT gets embedded into more everyday products, energy efficiency and device self-sufficiency become paramount concerns.
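That figure is easy to sanity-check. Assuming each device’s battery lasts about three years (an assumption for illustration; the estimate’s replacement interval isn’t stated in the article), a trillion devices work out to roughly 913 million replacements per day:

```python
DEVICES = 1_000_000_000_000        # a trillion connected devices
BATTERY_LIFE_DAYS = 3 * 365        # assumed ~3-year battery life (illustrative)

batteries_per_day = DEVICES / BATTERY_LIFE_DAYS
print(f"{batteries_per_day:,.0f} battery replacements per day")
# -> roughly 913,242,009 per day, i.e. about 913 million
```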

The smart label unveiled by Vodafone and Bayer is an early example of how broad-based IoT might work. The printed label collects information about itself - temperature, location, packing integrity - and periodically sends it to a central tracking hub. Through careful system and network design, the label can provide updates for three years.

Arm’s Project Triffid takes this a step further: the experimental SoC harvests energy from RF beams to perform calculations and gather data. While not as computationally robust as the smart label, an RFID tag with integrated compute can reduce the cost and management overhead of making things “smart”, opening the door to embedding intelligence in far more products. Imagine a shoe maker that offers consumers $5 for returning old shoes as part of an effort to achieve 100% recyclability.

Are you going to remember to fill in a form or send in the shoes? Not likely, but what if the manufacturer created a system where you could drop your shoes in a recycling bin or at a retailer? Someone could flash the chip, confirm the purchase and credit $5 to the buyer’s card, all while protecting the buyer’s information with appropriate levels of security.
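To make the idea concrete, here is a purely hypothetical sketch of that redemption flow, with invented record fields and a stand-in for the retailer’s payment backend; nothing here describes an actual Arm or retailer system.

```python
from dataclasses import dataclass

@dataclass
class ShoeTag:
    """Data a recycler might read from the shoe's embedded chip (hypothetical fields)."""
    tag_id: str          # unique, non-personal identifier burned into the tag
    purchase_token: str  # opaque token the retailer can map back to the original sale
    redeemed: bool = False

def redeem(tag: ShoeTag, issue_credit) -> bool:
    """Confirm the purchase and credit $5 without exposing the buyer's identity.

    `issue_credit` stands in for the retailer's payment backend: it only ever
    sees the opaque purchase token, never the buyer's card details.
    """
    if tag.redeemed:
        return False                      # each pair can only be redeemed once
    issue_credit(tag.purchase_token, amount_usd=5)
    tag.redeemed = True                   # "flash" the chip so it can't be reused
    return True

# Example with a stubbed payment backend:
paid = []
redeem(ShoeTag("TAG-001", "tok-8f2c"),
       lambda token, amount_usd: paid.append((token, amount_usd)))
print(paid)   # [('tok-8f2c', 5)]
```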

Systems for our systems

Our demand for software is outstripping traditional methods of developing it. IDC estimates that 500 million apps and services will be developed using cloud-native approaches by 2023, as many as were developed in the previous 40 years combined.

As time goes on, we will shift from developing applications to developing tools that can develop applications on our behalf. Similarly, a growing portion of chip design and verification will have to be performed through AI-powered applications, particularly for low-volume products (i.e., thousands of units) optimised for particular uses.

Lifetime, over-the-air support for devices will similarly become automated. “Over-the-air automated, secure upgrades will have to become the norm,” says Mark Hambleton, Arm’s Vice President of Software. “The smartphone experience, I believe, will become a template for other industries. Unlike the PC, the smartphone world didn’t go through a phase where it trusted users to do the right thing.”
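As a simplified illustration of that “don’t trust the user” model, a device can refuse any over-the-air image whose signature doesn’t verify against a public key it already trusts. The sketch below uses Ed25519 via the Python cryptography package purely as an example; it is not a description of any particular vendor’s update path.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Vendor side: sign the firmware image with a private key kept off-device.
vendor_key = Ed25519PrivateKey.generate()
firmware = b"\x7fELF...new firmware image..."      # placeholder payload
signature = vendor_key.sign(firmware)

# Device side: only the public key ships with the device.
device_trusted_key = vendor_key.public_key()

def apply_update(image: bytes, sig: bytes) -> bool:
    """Install the image only if the vendor's signature verifies."""
    try:
        device_trusted_key.verify(sig, image)
    except InvalidSignature:
        return False          # reject tampered or corrupted images
    # ...write image to the inactive slot, set the boot flag, reboot...
    return True

print(apply_update(firmware, signature))                  # True
print(apply_update(firmware + b"tamper", signature))      # False
```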

Does this mean the end of human control over our world? Not really. Engineers will ultimately be working on the more complex tasks.

Neural networks everywhere

“We will see neural networks continue to replace classical computing methods across a range of industries. What started as fun tech for recognising cats in pictures will have evolved into a technological juggernaut, completely transforming most industries. Consumer electronics, healthcare, law, communications, the automotive industry - all will be transformed as neural network technology marches forward, achieving things once thought impossible for machines,” writes Ian Bratt, Arm Fellow and Senior Director of Technology.

“The insatiable demand for neural network compute is already providing the motivation for a new class of processor optimised specifically for neural networks. New processor architectures with tensor-level operation abstractions will be present in nearly every computing platform, running the majority of computing cycles. This new class of processor will achieve orders-of-magnitude efficiency gains over traditional computing platforms, heralding an industry-wide shift in the computing landscape. And of course, it will be running on Arm.”
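“Tensor-level operation abstractions” here means the processor is handed whole-matrix or whole-tensor operations rather than streams of scalar instructions, leaving it free to manage tiling, parallelism and data movement itself. A rough illustration using NumPy as a stand-in for such a tensor API (not any specific Arm product):

```python
import numpy as np

rng = np.random.default_rng(1)
activations = rng.standard_normal((16, 32)).astype(np.float32)
weights = rng.standard_normal((32, 64)).astype(np.float32)

# Scalar view: ~32 thousand multiply-accumulates expressed one at a time.
out_scalar = np.zeros((16, 64), dtype=np.float32)
for i in range(16):
    for j in range(64):
        for k in range(32):
            out_scalar[i, j] += activations[i, k] * weights[k, j]

# Tensor view: the same computation as a single operation that the hardware
# (or library) is free to tile, parallelise and keep on-chip.
out_tensor = activations @ weights

print(np.allclose(out_scalar, out_tensor, atol=1e-3))   # True
```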