Intel has just launched its DLIA (Deep Learning Inference Accelerator) PCIe card, powered by an Intel Arria 10 FPGA, which aims to accelerate CNN (convolutional neural network) workloads such as image recognition while lowering power consumption.

Some of the Intel DLIA hardware specifications:

FPGA – Intel (previously Altera) Arria 10 FPGA @ 275 MHz delivering up to 1.5 TFLOPS

System Memory – 2 banks of 4GB 64-bit DDR4

PCIe – Gen3 host interface; x8 electrical, x16 power & mechanical

Form Factor – Full-length, full-height, single wide PCIe card

Operating Temperature – 0 to 85 °C

TDP – 50 to 75 Watts, hence the two cooling fans
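A quick back-of-the-envelope check on the figures above (my own arithmetic, not official Intel numbers) shows what the headline throughput implies about the FPGA fabric and the card's energy efficiency:

```python
# Back-of-the-envelope arithmetic from the spec list above
# (author's own calculation, not official Intel figures).

PEAK_FLOPS = 1.5e12   # 1.5 TFLOPS peak throughput
CLOCK_HZ = 275e6      # 275 MHz FPGA clock
TDP_WATTS = 75        # upper end of the 50 to 75 W TDP range

# Operations the fabric must complete every clock cycle to reach the peak
ops_per_cycle = PEAK_FLOPS / CLOCK_HZ            # ~5455 ops/cycle

# Energy efficiency at the worst-case TDP
gflops_per_watt = PEAK_FLOPS / 1e9 / TDP_WATTS   # 20 GFLOPS/W

print(f"{ops_per_cycle:.0f} ops/cycle, {gflops_per_watt:.0f} GFLOPS/W")
```

In other words, hitting 1.5 TFLOPS at 275 MHz requires on the order of 5,500 parallel operations per clock, which is the kind of wide datapath the pre-programmed DLA IP provides.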

The card is supported on CentOS 7.2, and relies on the Intel Caffe framework and the Intel Math Kernel Library for Deep Neural Networks (MKL-DNN). It works with various network topologies (AlexNet, GoogleNet, CaffeNet, LeNet, VGG-16, SqueezeNet…). The FPGA comes pre-programmed with Intel Deep Learning Accelerator IP (DLA IP).

Intel DLIA can be used by cloud service providers to filter content and track product photos, in surveillance and security applications such as face recognition and license plate detection, in factories to detect defects automatically, and in retail stores to track foot traffic and monitor inventory.

You’ll find more details, including links to get started and the SDK, on the product page.