
Intel unveils new family of AI chips to take on Nvidia’s GPUs

When the AI boom came knocking, Intel wasn't ready to answer. Now, the company is trying to reassert its power in the silicon business by unveiling a new family of chips built specifically for artificial intelligence: the Intel Nervana Neural Network Processor family, or NNP for short. The NNP family is designed as a response to the needs of machine learning, and it's destined for the data center, not your PC. Intel's CPUs remain a stalwart of server stacks (by some estimates the company holds a 96 percent market share in data centers), but the workloads of modern AI are better served by graphical processors, or GPUs, from companies including Nvidia and ARM.

Consequently, demand for these companies' chips has skyrocketed. (Nvidia's revenue is up 56 percent year on year.) Google has gotten in on the action too, designing its own silicon called the Tensor Processing Unit to power its cloud computing business, while newer firms like UK-based Graphcore are also rushing to fill the gap.


Intel's response has been to buy up AI hardware talent. It purchased vision specialist Mobileye this March; the chipmaker Movidius (the firm responsible for the silicon in DJI's autonomous drones) last September; and deep learning startup Nervana Systems in August 2016. Since then, it's been busy teasing this line of Neural Network Processors, previously known under the codename "Lake Crest." The NNP chips are a direct result of the Nervana acquisition and fold in that company's expertise to deliver "faster training time for deep learning models." (Intel says it also took advice from Facebook on the chip's design, but didn't give much detail.)

But how much faster exactly? Intel isn't saying. While Google touted the launch of its latest-generation TPU chips by releasing head-to-head tests against comparable hardware, Intel will only say that it's on track to meet its goal of improving deep learning training speeds 100-fold by 2020. The company is similarly vague on when its NNP chips will be available to customers, though more information may emerge today. The expectation is that they will ship in small quantities sometime before the end of the year.