Speck Demo Kit speeds up deployment of event-based neuromorphic vision applications

SynSense, a commercial supplier of ultra-low-power neuromorphic hardware and application solutions, has launched the Speck Demo Kit.

This compact development module lets users deploy and validate event-based neuromorphic vision applications. It incorporates SynSense’s ultra-low-power dynamic-vision Speck module, along with an ultra-low-power Bluetooth controller chip and commonly used peripherals.

The Demo Kit serves as a simplified embedded hardware platform, providing early access for users to test and verify neuromorphic application models and create prototypes.

The kit is the latest verification platform introduced by SynSense, following the high-performance neuromorphic HDK, the Speck Devkit and the Xylo-Audio Devkit. Designed specifically for the needs of users in industrial and consumer development, the Speck Demo Kit integrates SynSense’s new dynamic vision module with application model resources.

The Speck Demo Kit contains:

  • 2x Speck modules (with lens focal lengths of 3.62mm and 1.98mm)
  • 1x IoT (BLE) board
  • 1x 150mAh lithium battery
  • 1x Micro USB 2.0 cable

Key Features:

  • Powered by SynSense’s dynamic vision SoC, Speck
  • Equipped with ultra-low-power Bluetooth controller, supporting BLE 5.0
  • Supports both Speck active configuration mode (default) and passive configuration mode with user-stored applications
  • 4×4 LED array and active buzzer for displaying application results directly
  • Supports USB and lithium battery power supply, with built-in lithium battery charging management in the MCU
  • Reserved system reset button, JTAG, UART, and 2 user buttons to support secondary development of the embedded system
  • Reserved (unpopulated) IR receiver/transmitter circuit, RS485 circuit, motor drive circuit, and other peripherals and external connectors

The Speck SoC is a fully integrated, multi-core, single-chip sensor-plus-processor featuring an integrated 128×128 DVS (Dynamic Vision Sensor) imaging array, enabling real-time, low-power vision processing for mobile and IoT applications.

It can perform intelligent scene analysis at micro-power levels (<10mW) with real-time response (<200ms), and is fully configurable with a network capacity of 0.32 million neurons. Speck’s ultra-low-power, ultra-low-latency capabilities mean it can support always-on IoT and edge-computing applications such as gesture recognition, face and object detection, tracking, and surveillance.
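To give a flavour of the event-driven processing model behind chips like Speck, here is a minimal sketch (illustrative only, not SynSense’s API) of a leaky integrate-and-fire (LIF) neuron, the basic unit of a spiking neural network. DVS pixels emit events when brightness changes; each event adds charge to a neuron’s membrane potential, which leaks over time and fires a spike when it crosses a threshold. All parameter values here are made up for illustration.

```python
def lif_neuron(events, leak=0.9, weight=0.5, threshold=1.0):
    """Simulate one leaky integrate-and-fire neuron over an event train.

    events    -- per-timestep input event counts (e.g. from DVS pixels)
    leak      -- membrane decay factor applied each timestep
    weight    -- synaptic weight applied to each input event
    threshold -- membrane potential at which the neuron fires
    Returns a list of output spikes (0 or 1 per timestep).
    """
    v = 0.0
    spikes = []
    for n_events in events:
        v = v * leak + weight * n_events  # integrate input, with leak
        if v >= threshold:                # fire and reset
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

# Sparse events leak away without firing; a dense burst triggers a spike.
print(lif_neuron([1, 0, 0, 0, 1]))  # → [0, 0, 0, 0, 0]
print(lif_neuron([1, 1, 1, 0, 0]))  # → [0, 0, 1, 0, 0]
```

Because neurons only compute when events arrive and only communicate via sparse spikes, this style of processing is what allows hardware like Speck to stay within a micro-power budget.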

Currently, more than 100 industry customers, universities and research institutes are using SynSense boards and software to design, build and deploy applications based on spiking neural networks.