When the AI boom came a-knocking, Intel wasn’t around to answer the call. Now, the company is attempting to reassert its authority in the silicon business by unveiling a new family of chips designed especially for artificial intelligence: the Intel Nervana Neural Network Processor family, or NNP for short.

The NNP family is meant as a response to the needs of machine learning, and is destined for the data center, not your PC. Intel’s CPUs may still be a stalwart of server stacks (by some estimates, it has a 96 percent market share in data centers), but the workloads of contemporary AI are much better served by the graphics processing units, or GPUs, coming from firms like Nvidia and AMD. Consequently, demand for these companies’ chips has skyrocketed. (Nvidia’s revenue is up 56 percent year on year.) Google has gotten in on the action, designing its own silicon named the Tensor Processing Unit to power its cloud computing business, while new firms like the UK-based Graphcore are also rushing to fill the gap.

Intel’s response has been to buy up AI hardware talent. It purchased vision specialist Mobileye this March; the chipmaker Movidius (the firm responsible for the silicon in DJI’s autonomous drones) last September; and deep learning startup Nervana Systems in August 2016. Since then, it’s been busy teasing this line of Neural Network Processors, which were previously known under the codename “Lake Crest.” The NNP chips are a direct result of the Nervana acquisition and fold in that company’s expertise to achieve “faster training time for deep learning models.” (Intel says it also took advice from Facebook on the chip’s design — but didn’t give much detail.)

But how much faster exactly? Intel isn’t saying. While Google touted the launch of its latest-generation TPU chips by publishing head-to-head tests against rival hardware, Intel will only say that it’s on track to meet its goal of improving deep learning training speeds 100-fold by 2020. The company is similarly vague on when its NNP chips will be available to customers, though perhaps more details will leak out today. The expectation is that they’ll ship in limited quantities some time before the end of the year.