QuickAI™

Artificial Intelligence (AI) & Cognitive Sensing at the Endpoint

 

Introducing a Complete AI Platform Solution

  • Supported by an Ecosystem of partners

  • Enables AI at the Endpoint device



Benefits

  • Includes a sensor processing platform, neurons for AI computing, and application software for knowledge building and data classification

  • Scales to larger AI systems

  • No need for in-house expertise in data analytics, DSP processing, or application coding



Target Applications

  • Industrial IoT

  • Predictive Maintenance applications

  • Vision Inspection System


Endpoint Market Opportunity
  • 365M industrial devices to rely on cognitive computing in 2023

  • Video applications to remain predominant, with strong growth in manufacturing and retail

  • Only 25% of devices to rely on the cloud for training, inference, or both in 2023

Addressing the Challenges of Endpoint

  • Organizations working at the endpoint don’t know how to make use of big data, the cloud, or AI

  • Lack of data scientists and data-modeling know-how

  • The diversity of applications and use cases makes it impractical to buy off-the-shelf models and algorithms for every use case

  • Development cost and complexity of managing deployed models on distributed devices

Bringing together key ecosystem partners with complementary expertise to enable OEMs to develop AI solutions for the Endpoint

QuickAI™ Platform Solution

The new QuickAI platform provides an all-inclusive, low-power solution and development environment to economically incorporate the benefits of AI into endpoint applications. It features technology, software, and toolkits from General Vision, Nepes, SensiML, and QuickLogic, all of which have formed a tightly coupled ecosystem to solve the challenges associated with implementing AI for endpoint applications.

QuickAI

 

SensiML (Analytics Toolkit):
  • Application SW to make AI scalable
  • Data collection and training
  • Conducts both the learning and the inference
  • Feature extraction
  • Scales across applications and customers
  • Cloud-based

Nepes (NM500):
  • 576 neurons
  • 0.1 W power
  • NeuroMem IP – Radial Basis Function (RBF)
  • Concurrent, parallel operation with scalability (cascade design for additional neurons)

QuickLogic (EOS S3):
  • Low-power voice and sensor SoC
  • Optimized multi-core (Arm Cortex-M4F & FFE) processing
  • eFPGA fabric for feature extraction and interface
  • QuickAI HDK platform for evaluation/development

General Vision:
The QuickAI platform is based on General Vision's NeuroMem neural network IP, which has been licensed by Nepes and integrated into the Nepes Neuromorphic NM500 AI learning device. Both General Vision and Nepes provide software for configuring and training the neurons in the network. In addition, SensiML provides an analytics toolkit to quickly and easily build smart sensor algorithms for endpoint IoT applications.

General Vision's technology is the foundation element of the platform. This technology enables on-chip exact and fuzzy pattern matching and learning using a scalable parallel architecture of Radial Basis Function neurons. The parallel architecture results in fixed latency for any number of neurons and a very low, energy efficient operating frequency. General Vision supplies the Knowledge Builder tool suite and SDK used to train and configure the neurons in a NeuroMem network.
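
The matching-and-learning behavior described above can be sketched in software. The following is an illustrative model only, assuming L1 (Manhattan) distance, per-neuron influence fields that shrink when a conflicting example falls inside them, and a closest-firing-neuron readout; the class name, constants, and method signatures are hypothetical and are not General Vision's actual SDK or the NM500 register interface.

```python
# Illustrative sketch of RBF-style learning and recognition in the spirit of
# a NeuroMem network. All names and constants here are assumptions for the
# example, not the real API.

MAX_IF = 16384  # initial (maximum) influence field -- assumed value
MIN_IF = 2      # smallest allowed influence field -- assumed value

class RBFNetwork:
    def __init__(self):
        # Each committed neuron holds (prototype vector, category, influence field)
        self.neurons = []

    @staticmethod
    def _dist(a, b):
        # Manhattan (L1) distance between component vectors
        return sum(abs(x - y) for x, y in zip(a, b))

    def learn(self, vector, category):
        # Any firing neuron of a *different* category shrinks so it no longer
        # covers this example; a new neuron is committed only if no neuron of
        # the right category already recognizes the vector.
        recognized = False
        for i, (proto, cat, aif) in enumerate(self.neurons):
            d = self._dist(vector, proto)
            if d < aif:
                if cat != category:
                    self.neurons[i] = (proto, cat, max(d, MIN_IF))
                else:
                    recognized = True
        if not recognized:
            self.neurons.append((list(vector), category, MAX_IF))

    def classify(self, vector):
        # The firing neuron with the smallest distance wins. In the parallel
        # hardware all neurons evaluate at once, which is why latency stays
        # fixed regardless of the neuron count.
        best = None
        for proto, cat, aif in self.neurons:
            d = self._dist(vector, proto)
            if d < aif and (best is None or d < best[0]):
                best = (d, cat)
        return best[1] if best else None
```

Training two prototypes and classifying nearby vectors shows the field-shrinking behavior: a second, conflicting example reduces the first neuron's influence field so the two categories no longer overlap.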

Nepes:
The Nepes Neuromorphic NM500 implements the NeuroMem technology in an energy efficient, small form factor component. This AI-enabling component can be trained in the field to recognize patterns in real time, and multiple devices can be chained to provide any number of neurons. In addition to the NM500, Nepes supplies the Knowledge Studio software tools used for configuring and training the neurons in the NM500 device.

SensiML:
SensiML complements the General Vision/Nepes technology by providing the SensiML Analytics Toolkit, which simplifies the task of generating endpoint AI solutions by providing tools that automate the management of training data, optimize the choice of feature extraction algorithms, and automate code generation for the resulting AI solution.

QuickLogic:
The EOS™ S3 voice and sensor processing platform from QuickLogic is the final critical element of this new platform initiative. Its ultra-low power consumption, sophisticated audio and sensor processing, and embedded FPGA make it the ideal host for the NM500 and the software that implements AI solutions using the NM500.

The platform includes the QuickAI Hardware Development Kit (HDK) with EOS S3, two NM500 devices, accelerometer, gyroscope, magnetometer, microphones, Nordic Bluetooth® Low Energy device, flash memory, and an Intel Edison-compatible connector that allows access to Edison daughter boards such as uSD.

QuickAI HDK Platform

 

Benefits of QuickAI HDK:

  • Demo/Eval/Development/Deployment Platform
  • Reduces time to development and time to market
  • Enables AI (Data Collection, Feature Extraction, Classifier) with Motion, Acoustic & Image Processing
  • Expandable for More NM500s

Application #1: Industrial Vision Inspection

 

Vision Inspection Challenges:

  • Classification of texture (foods and surfaces)
  • Adapting quickly to changes in materials (color)
  • High-speed template learning and matching

FPGA Features:

  • Sensor Data -> Feature Extraction -> Vector
  • FFT or MFCC Feature Extraction
  • NM500 Interface
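
The "Sensor Data -> Feature Extraction -> Vector" path in the list above can be sketched in software. This is an illustration only, with assumed parameters (a 64-sample window, 16 frequency bins, scaling to bytes); on the platform itself the eFPGA fabric would compute the FFT in hardware and hand the resulting vector to the NM500, but the kind of vector produced is the same.

```python
# Illustrative software model of the FFT feature-extraction stage: a window
# of raw sensor samples is reduced to a small magnitude-spectrum vector,
# scaled to 0..255 so each component fits a byte-wide classifier input.
# Parameters (window size, bin count) are assumptions for the example.
import math

def fft_feature_vector(samples, n_features=16):
    """Naive DFT magnitude feature extraction over one sample window."""
    n = len(samples)
    mags = []
    for k in range(n_features):
        re = sum(s * math.cos(-2 * math.pi * k * i / n)
                 for i, s in enumerate(samples))
        im = sum(s * math.sin(-2 * math.pi * k * i / n)
                 for i, s in enumerate(samples))
        mags.append(math.hypot(re, im))
    # Normalize so the strongest bin maps to 255
    peak = max(mags) or 1.0
    return [int(255 * m / peak) for m in mags]
```

Feeding the function a pure tone concentrates the feature energy in one bin, which is the kind of compact, discriminative vector the classifier stage needs.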

Sensor Enabled Features:

  • Camera Data Analysis

FFE Enabled Features:

  • Ultra-Low Power AON Function Accelerator

Application #2: Industrial Predictive Maintenance

 

Industrial Predictive Maintenance Challenges:

  • A unique model doesn’t scale across similar applications
  • Pattern recognition & classification in real time
  • Power vs. performance trade-offs

FPGA Features:

  • Sensor Data -> Feature Extraction -> Vector
  • FFT or MFCC Feature Extraction
  • NM500 Interface

Sensor Enabled Features:

  • Vibration (Accel/Gyro) Analysis
  • Audio Signature Analysis

FFE Enabled Features:

  • Ultra-Low Power AON Function Accelerator

Documentation

Have a Question?

If you have questions for our solution experts, please contact us!