Lattice enhances sensAI stack for AI at the edge

Lattice Semiconductor has launched the latest version of its complete solutions stack for on-device AI processing at the Edge, Lattice sensAI 3.0.

This version of the stack adds support for the CrossLink-NX family of FPGAs for low-power smart vision applications. It features a flexible, customised convolutional neural network (CNN) accelerator IP that simplifies the implementation of common CNN networks and is optimised to further leverage the parallel processing capabilities of FPGAs.

With the addition of support for CrossLink-NX FPGAs, Lattice sensAI brings new levels of power and performance to smart vision applications in the surveillance/security, robotics, automotive, and computing markets.

To address data security, latency, and privacy issues, developers want to move the AI processing that powers their smart vision and other AI applications from the cloud to the Edge.

Most Edge devices are battery-powered or otherwise sensitive to power consumption, so developers need hardware and software solutions that deliver the processing capabilities needed for AI applications, while keeping power consumption as low as possible.

By enhancing the sensAI stack, Lattice said that it was looking to widen the range of power and performance options available to customers. For applications like smart vision that require higher Edge AI performance, CrossLink-NX FPGAs running sensAI software deliver twice the performance at half the power when compared to prior releases of the solutions stack.

Features of the sensAI solutions stack include:

  • New CNN engine IP and compiler support for CrossLink-NX – the stack now supports a customised CNN accelerator IP running on a CrossLink-NX FPGA that takes advantage of the underlying parallel processing architecture of the FPGA. Updates to the NN compiler software tool let developers easily compile a trained NN model and download it to a CrossLink-NX FPGA.
  • CrossLink-NX-based object counting demo – a VGG-based object counting demo operating on a CrossLink-NX FPGA delivers 10 frames-per-second while consuming only 200 mW. Object counting is a common smart vision application used in the surveillance/security, industrial, automotive, and robotics markets.
  • Optimised FPGA architecture for CrossLink-NX – when running on a CrossLink-NX FPGA, the sensAI solutions stack offers up to 2.5 Mb of distributed memory and block RAM and additional DSP resources for efficient on-chip implementation of AI workloads to reduce the need for cloud-based analytics.
  • Up to 75 percent lower power consumption – CrossLink-NX FPGAs are manufactured in a 28nm FD-SOI process that delivers a 75 percent reduction in power in comparison to similar competing FPGAs.
  • High-performance I/O – many components (image sensors, applications processors, etc.) used in smart vision systems require support for the MIPI I/O standard. Smart vision is one of the target applications for sensAI, and CrossLink-NX devices are currently the only low-power FPGAs to deliver MIPI I/O speeds of up to 2.5 Gbps, making them a suitable platform for sensAI applications requiring MIPI support. CrossLink-NX FPGAs' I/Os offer instant-on performance, configuring themselves in less than 3 ms, with full-device configuration in as little as 8 ms.
  • Increased neural network architecture support – previous versions of sensAI supported the VGG and MobileNet v1 neural network models. The latest version of the stack adds support for the MobileNet v2, SSD, and ResNet models on the Lattice ECP5™ family of general-purpose FPGAs.


Author
Neil Tyler


This material is protected by MA Business copyright See Terms and Conditions. One-off usage is permitted but bulk copying is not. For multiple copies contact the sales team.

